Test Report: Docker_Linux_containerd_arm64 21997

f52e7af1cf54d5c1b3af81f5f4f56bb8b0b6d6f9:2025-12-01:42595
Failed tests: 25/321

Order  Failed test  Duration (s)
171 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy 505.96
173 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart 368.39
175 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods 2.17
185 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd 2.38
186 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly 2.28
187 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig 737.97
188 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth 2.25
191 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService 0.07
194 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd 1.71
197 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd 3.19
201 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect 2.43
203 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim 241.66
213 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels 1.69
219 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel 0.54
222 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup 0.18
223 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect 107.55
228 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp 0.07
229 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List 0.27
230 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput 0.27
231 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS 0.27
232 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format 0.26
233 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL 0.27
237 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port 2.23
358 TestKubernetesUpgrade 794.27
486 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 7200.175
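The cases above are ordinary Go tests (see the `functional_test.go` and `helpers_test.go` references in the logs below), so a single failure can be re-run locally with `go test -run`, which treats the name as slash-separated regexes per subtest level. A hypothetical sketch (the `./test/integration` package path and the timeout are assumptions, and the dots in the version string must be escaped so they are not treated as regex wildcards):

```shell
# Pick one failed test name from the table above.
TEST='TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy'

# -run interprets each slash-separated segment as a regex, so escape the
# literal dots in the Kubernetes version segment.
PATTERN=$(printf '%s' "$TEST" | sed 's/\./\\./g')

# Print the command rather than executing it: actually running it requires
# a minikube checkout with the out/minikube-linux-arm64 binary built.
echo "go test ./test/integration -run \"$PATTERN\" -timeout 30m"
```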
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy (505.96s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
E1201 19:19:27.087986    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:19:54.803626    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:46.979914    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:46.986426    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:46.997961    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:47.019488    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:47.060957    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:47.142611    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:47.304294    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:47.625963    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:48.268113    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:49.549649    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:52.111027    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:21:57.233193    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:22:07.475388    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:22:27.956775    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:23:08.918972    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:24:27.088111    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:24:30.840461    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 109 (8m24.499053135s)

-- stdout --
	* [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on user configuration
	* Using Docker driver with root privileges
	* Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Found network options:
	  - HTTP_PROXY=localhost:35755
	* Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	
	

-- /stdout --
** stderr ** 
	! Local proxy ignored: not passing HTTP_PROXY=localhost:35755 to docker env.
	! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-428744 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-428744 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.00013097s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000176325s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000176325s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
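The repeated `[WARNING SystemVerification]` line in the stderr above says that on a cgroups v1 host (this runner is on kernel 5.15 with cgroups v1), kubelet v1.35 or newer requires the kubelet configuration option `FailCgroupV1` to be set to `false`. A minimal sketch of that setting as a KubeletConfiguration fragment, assuming the field is spelled in camelCase in YAML like other kubelet config fields:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Assumed camelCase spelling of the 'FailCgroupV1' option named in the
# kubeadm warning; allows kubelet to start on a cgroups v1 host.
failCgroupV1: false
```

The warning also notes that the corresponding SystemVerification preflight check must be explicitly skipped; whether the minikube base image should instead move this runner to cgroups v2 is the broader fix suggested by the deprecation notice.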
functional_test.go:2241: failed minikube start. args "out/minikube-linux-arm64 start -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0": exit status 109
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
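As a reading aid for the `docker inspect` dump above: the `NetworkSettings.Ports` block maps each container port to its published host address. A minimal sketch of flattening that structure, run here against a hand-trimmed fragment of the JSON above rather than a live `docker inspect` call (the helper name `port_map` is illustrative, not part of any tooling):

```python
import json

# Trimmed inspect-style fragment mirroring the NetworkSettings.Ports
# block in the output above (two of the five published ports).
inspect_output = json.loads("""
[{"NetworkSettings": {"Ports": {
    "22/tcp":   [{"HostIp": "127.0.0.1", "HostPort": "32788"}],
    "8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "32791"}]}}}]
""")

def port_map(inspect_json):
    """Flatten Ports into {container_port: "host_ip:host_port"}.

    Unpublished ports appear with an empty binding list and are skipped.
    """
    ports = inspect_json[0]["NetworkSettings"]["Ports"]
    return {cport: f'{b[0]["HostIp"]}:{b[0]["HostPort"]}'
            for cport, b in ports.items() if b}

print(port_map(inspect_output))
# {'22/tcp': '127.0.0.1:32788', '8441/tcp': '127.0.0.1:32791'}
```

In this run, 8441/tcp is the `--apiserver-port` from the failing `start` invocation, published on 127.0.0.1:32791.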
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 6 (300.492043ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1201 19:26:05.559296   48511 status.go:458] kubeconfig endpoint: get endpoint: "functional-428744" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig

** /stderr **
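The exit status 6 above comes from the kubeconfig endpoint lookup in status.go: the profile name "functional-428744" is absent from the clusters recorded in the kubeconfig (which, per the warning, still points at a stale "minikube-vm" entry). A minimal sketch of that lookup, using a hand-written stand-in for the parsed kubeconfig rather than the real file at the path in the error:

```python
# Stand-in for a parsed kubeconfig; the cluster list and the stale
# "minikube-vm" entry (and its server URL) are assumed for illustration.
kubeconfig = {
    "clusters": [
        {"name": "minikube-vm",
         "cluster": {"server": "https://192.168.49.2:8441"}},
    ],
}

def endpoint_for(profile, cfg):
    """Return the API server URL for a profile, or None when the profile
    does not appear among the kubeconfig clusters (the failing case here)."""
    for c in cfg["clusters"]:
        if c["name"] == profile:
            return c["cluster"]["server"]
    return None

print(endpoint_for("functional-428744", kubeconfig))  # None -> status error
```

When the lookup comes back empty, `minikube update-context` (the fix suggested in the warning above) rewrites the kubeconfig entry for the profile.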
helpers_test.go:247: status error: exit status 6 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /usr/share/ca-certificates/43052.pem                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /etc/test/nested/copy/4305/hosts                                                                                                 │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image save kicbase/echo-server:functional-019259 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ update-context │ functional-019259 update-context --alsologtostderr -v=2                                                                                                         │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ update-context │ functional-019259 update-context --alsologtostderr -v=2                                                                                                         │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image rm kicbase/echo-server:functional-019259 --alsologtostderr                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image save --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format yaml --alsologtostderr                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format short --alsologtostderr                                                                                                     │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format json --alsologtostderr                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format table --alsologtostderr                                                                                                     │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh pgrep buildkitd                                                                                                                           │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ image          │ functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr                                                          │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ delete         │ -p functional-019259                                                                                                                                            │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ start          │ -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:17:40
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:17:40.795651   42501 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:17:40.795788   42501 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:17:40.795792   42501 out.go:374] Setting ErrFile to fd 2...
	I1201 19:17:40.795796   42501 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:17:40.796215   42501 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:17:40.796858   42501 out.go:368] Setting JSON to false
	I1201 19:17:40.798379   42501 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":3612,"bootTime":1764613049,"procs":153,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:17:40.798507   42501 start.go:143] virtualization:  
	I1201 19:17:40.802919   42501 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:17:40.807692   42501 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:17:40.807760   42501 notify.go:221] Checking for updates...
	I1201 19:17:40.814670   42501 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:17:40.817994   42501 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:17:40.821379   42501 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:17:40.824576   42501 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:17:40.827737   42501 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:17:40.831215   42501 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:17:40.863683   42501 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:17:40.863787   42501 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:17:40.925169   42501 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:42 SystemTime:2025-12-01 19:17:40.914937852 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:17:40.925260   42501 docker.go:319] overlay module found
	I1201 19:17:40.928482   42501 out.go:179] * Using the docker driver based on user configuration
	I1201 19:17:40.931575   42501 start.go:309] selected driver: docker
	I1201 19:17:40.931584   42501 start.go:927] validating driver "docker" against <nil>
	I1201 19:17:40.931595   42501 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:17:40.932313   42501 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:17:40.986812   42501 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:42 SystemTime:2025-12-01 19:17:40.977848412 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:17:40.986965   42501 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1201 19:17:40.987187   42501 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1201 19:17:40.990095   42501 out.go:179] * Using Docker driver with root privileges
	I1201 19:17:40.993092   42501 cni.go:84] Creating CNI manager for ""
	I1201 19:17:40.993158   42501 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:17:40.993165   42501 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1201 19:17:40.993267   42501 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:17:40.996465   42501 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:17:40.999343   42501 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:17:41.002395   42501 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:17:41.005474   42501 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:17:41.005539   42501 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:17:41.024875   42501 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:17:41.024887   42501 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:17:41.058710   42501 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:17:41.255796   42501 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:17:41.256024   42501 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256124   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:17:41.256138   42501 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 122.63µs
	I1201 19:17:41.256151   42501 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:17:41.256162   42501 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256173   42501 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:17:41.256197   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:17:41.256202   42501 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 41.839µs
	I1201 19:17:41.256200   42501 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json: {Name:mk36dc86e76691dbdae0e327f196c4488b2d3a57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:41.256206   42501 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:17:41.256215   42501 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256287   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:17:41.256291   42501 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 76.573µs
	I1201 19:17:41.256296   42501 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:17:41.256305   42501 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256332   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:17:41.256336   42501 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 32.173µs
	I1201 19:17:41.256340   42501 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:17:41.256352   42501 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:17:41.256348   42501 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256372   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:17:41.256374   42501 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256376   42501 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 28.661µs
	I1201 19:17:41.256381   42501 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:17:41.256389   42501 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256410   42501 start.go:364] duration metric: took 28.029µs to acquireMachinesLock for "functional-428744"
	I1201 19:17:41.256422   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:17:41.256426   42501 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 38.5µs
	I1201 19:17:41.256430   42501 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:17:41.256442   42501 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256425   42501 start.go:93] Provisioning new machine with config: &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: AP
IServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNS
Log:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 19:17:41.256480   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:17:41.256484   42501 start.go:125] createHost starting for "" (driver="docker")
	I1201 19:17:41.256490   42501 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 52.514µs
	I1201 19:17:41.256494   42501 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:17:41.256502   42501 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:17:41.256530   42501 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:17:41.256533   42501 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 32.805µs
	I1201 19:17:41.256538   42501 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:17:41.256542   42501 cache.go:87] Successfully saved all images to host disk.
	I1201 19:17:41.261741   42501 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	W1201 19:17:41.262045   42501 out.go:285] ! Local proxy ignored: not passing HTTP_PROXY=localhost:35755 to docker env.
	I1201 19:17:41.262120   42501 start.go:159] libmachine.API.Create for "functional-428744" (driver="docker")
	I1201 19:17:41.262142   42501 client.go:173] LocalClient.Create starting
	I1201 19:17:41.262241   42501 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem
	I1201 19:17:41.262273   42501 main.go:143] libmachine: Decoding PEM data...
	I1201 19:17:41.262286   42501 main.go:143] libmachine: Parsing certificate...
	I1201 19:17:41.262350   42501 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem
	I1201 19:17:41.262366   42501 main.go:143] libmachine: Decoding PEM data...
	I1201 19:17:41.262377   42501 main.go:143] libmachine: Parsing certificate...
	I1201 19:17:41.262740   42501 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1201 19:17:41.278827   42501 cli_runner.go:211] docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1201 19:17:41.278910   42501 network_create.go:284] running [docker network inspect functional-428744] to gather additional debugging logs...
	I1201 19:17:41.278925   42501 cli_runner.go:164] Run: docker network inspect functional-428744
	W1201 19:17:41.299667   42501 cli_runner.go:211] docker network inspect functional-428744 returned with exit code 1
	I1201 19:17:41.299688   42501 network_create.go:287] error running [docker network inspect functional-428744]: docker network inspect functional-428744: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network functional-428744 not found
	I1201 19:17:41.299699   42501 network_create.go:289] output of [docker network inspect functional-428744]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network functional-428744 not found
	
	** /stderr **
	I1201 19:17:41.299801   42501 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:17:41.318389   42501 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x400197ef90}
	I1201 19:17:41.318429   42501 network_create.go:124] attempt to create docker network functional-428744 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1201 19:17:41.318488   42501 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=functional-428744 functional-428744
	I1201 19:17:41.377481   42501 network_create.go:108] docker network functional-428744 192.168.49.0/24 created
	I1201 19:17:41.377527   42501 kic.go:121] calculated static IP "192.168.49.2" for the "functional-428744" container
	I1201 19:17:41.377599   42501 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1201 19:17:41.393797   42501 cli_runner.go:164] Run: docker volume create functional-428744 --label name.minikube.sigs.k8s.io=functional-428744 --label created_by.minikube.sigs.k8s.io=true
	I1201 19:17:41.412762   42501 oci.go:103] Successfully created a docker volume functional-428744
	I1201 19:17:41.412831   42501 cli_runner.go:164] Run: docker run --rm --name functional-428744-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-428744 --entrypoint /usr/bin/test -v functional-428744:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -d /var/lib
	I1201 19:17:41.975193   42501 oci.go:107] Successfully prepared a docker volume functional-428744
	I1201 19:17:41.975249   42501 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	W1201 19:17:41.975381   42501 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1201 19:17:41.975486   42501 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1201 19:17:42.043513   42501 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname functional-428744 --name functional-428744 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-428744 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=functional-428744 --network functional-428744 --ip 192.168.49.2 --volume functional-428744:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8441 --publish=127.0.0.1::8441 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b
	I1201 19:17:42.401171   42501 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Running}}
	I1201 19:17:42.423931   42501 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:17:42.453347   42501 cli_runner.go:164] Run: docker exec functional-428744 stat /var/lib/dpkg/alternatives/iptables
	I1201 19:17:42.506911   42501 oci.go:144] the created container "functional-428744" has a running status.
	I1201 19:17:42.506930   42501 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa...
	I1201 19:17:42.709259   42501 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1201 19:17:42.734641   42501 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:17:42.754124   42501 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1201 19:17:42.754135   42501 kic_runner.go:114] Args: [docker exec --privileged functional-428744 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1201 19:17:42.811433   42501 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:17:42.837698   42501 machine.go:94] provisionDockerMachine start ...
	I1201 19:17:42.837800   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:42.868065   42501 main.go:143] libmachine: Using SSH client type: native
	I1201 19:17:42.868392   42501 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:17:42.868399   42501 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:17:42.868975   42501 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:49900->127.0.0.1:32788: read: connection reset by peer
	I1201 19:17:46.021213   42501 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:17:46.021228   42501 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:17:46.021298   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:46.039633   42501 main.go:143] libmachine: Using SSH client type: native
	I1201 19:17:46.039954   42501 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:17:46.039963   42501 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:17:46.198997   42501 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:17:46.199072   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:46.216620   42501 main.go:143] libmachine: Using SSH client type: native
	I1201 19:17:46.216937   42501 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:17:46.216950   42501 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:17:46.365622   42501 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:17:46.365638   42501 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:17:46.365670   42501 ubuntu.go:190] setting up certificates
	I1201 19:17:46.365677   42501 provision.go:84] configureAuth start
	I1201 19:17:46.365739   42501 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:17:46.384451   42501 provision.go:143] copyHostCerts
	I1201 19:17:46.384514   42501 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:17:46.384522   42501 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:17:46.384600   42501 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:17:46.384695   42501 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:17:46.384699   42501 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:17:46.384723   42501 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:17:46.384785   42501 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:17:46.384788   42501 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:17:46.384810   42501 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:17:46.384863   42501 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:17:46.545818   42501 provision.go:177] copyRemoteCerts
	I1201 19:17:46.545871   42501 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:17:46.545911   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:46.566878   42501 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:17:46.669055   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:17:46.686968   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:17:46.704991   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1201 19:17:46.722139   42501 provision.go:87] duration metric: took 356.440025ms to configureAuth
	I1201 19:17:46.722156   42501 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:17:46.722340   42501 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:17:46.722345   42501 machine.go:97] duration metric: took 3.884637267s to provisionDockerMachine
	I1201 19:17:46.722357   42501 client.go:176] duration metric: took 5.460204261s to LocalClient.Create
	I1201 19:17:46.722370   42501 start.go:167] duration metric: took 5.460252623s to libmachine.API.Create "functional-428744"
	I1201 19:17:46.722376   42501 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:17:46.722385   42501 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:17:46.722442   42501 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:17:46.722491   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:46.740164   42501 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:17:46.850118   42501 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:17:46.853621   42501 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:17:46.853639   42501 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:17:46.853650   42501 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:17:46.853706   42501 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:17:46.853788   42501 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:17:46.853879   42501 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:17:46.853924   42501 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:17:46.861782   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:17:46.879952   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:17:46.897587   42501 start.go:296] duration metric: took 175.198342ms for postStartSetup
	I1201 19:17:46.897951   42501 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:17:46.915022   42501 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:17:46.915289   42501 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:17:46.915329   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:46.932336   42501 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:17:47.034649   42501 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:17:47.039853   42501 start.go:128] duration metric: took 5.783354976s to createHost
	I1201 19:17:47.039870   42501 start.go:83] releasing machines lock for "functional-428744", held for 5.783453318s
	I1201 19:17:47.039949   42501 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:17:47.063565   42501 out.go:179] * Found network options:
	I1201 19:17:47.066378   42501 out.go:179]   - HTTP_PROXY=localhost:35755
	W1201 19:17:47.069440   42501 out.go:285] ! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	I1201 19:17:47.072453   42501 out.go:179] * Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	I1201 19:17:47.075420   42501 ssh_runner.go:195] Run: cat /version.json
	I1201 19:17:47.075480   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:47.075518   42501 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:17:47.075572   42501 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:17:47.096537   42501 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:17:47.097632   42501 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:17:47.197731   42501 ssh_runner.go:195] Run: systemctl --version
	I1201 19:17:47.295671   42501 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1201 19:17:47.300189   42501 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:17:47.300250   42501 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:17:47.327264   42501 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1201 19:17:47.327278   42501 start.go:496] detecting cgroup driver to use...
	I1201 19:17:47.327308   42501 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:17:47.327368   42501 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:17:47.342418   42501 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:17:47.355549   42501 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:17:47.355615   42501 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:17:47.374124   42501 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:17:47.393422   42501 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:17:47.526646   42501 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:17:47.648774   42501 docker.go:234] disabling docker service ...
	I1201 19:17:47.648840   42501 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:17:47.669904   42501 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:17:47.683054   42501 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:17:47.810306   42501 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:17:47.928288   42501 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:17:47.941584   42501 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:17:47.955446   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:17:47.964530   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:17:47.973450   42501 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:17:47.973535   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:17:47.982691   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:17:47.991975   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:17:48.001413   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:17:48.010820   42501 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:17:48.018939   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:17:48.027797   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:17:48.036839   42501 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:17:48.049533   42501 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:17:48.058111   42501 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:17:48.067031   42501 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:17:48.195270   42501 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:17:48.295283   42501 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:17:48.295339   42501 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:17:48.299806   42501 start.go:564] Will wait 60s for crictl version
	I1201 19:17:48.299872   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:48.303811   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:17:48.329647   42501 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:17:48.329712   42501 ssh_runner.go:195] Run: containerd --version
	I1201 19:17:48.349260   42501 ssh_runner.go:195] Run: containerd --version
	I1201 19:17:48.376822   42501 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:17:48.379795   42501 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:17:48.396068   42501 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:17:48.400054   42501 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1201 19:17:48.409662   42501 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:17:48.409755   42501 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:17:48.409805   42501 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:17:48.433667   42501 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-beta.0". assuming images are not preloaded.
	I1201 19:17:48.433681   42501 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-beta.0 registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 registry.k8s.io/kube-scheduler:v1.35.0-beta.0 registry.k8s.io/kube-proxy:v1.35.0-beta.0 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.6.5-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1201 19:17:48.433728   42501 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:17:48.433941   42501 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 19:17:48.434028   42501 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 19:17:48.434123   42501 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 19:17:48.434223   42501 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 19:17:48.434307   42501 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1201 19:17:48.434388   42501 image.go:138] retrieving image: registry.k8s.io/etcd:3.6.5-0
	I1201 19:17:48.434474   42501 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1201 19:17:48.435479   42501 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 19:17:48.435888   42501 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 19:17:48.436018   42501 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 19:17:48.436133   42501 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1201 19:17:48.436243   42501 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:17:48.436494   42501 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 19:17:48.436623   42501 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1201 19:17:48.436744   42501 image.go:181] daemon lookup for registry.k8s.io/etcd:3.6.5-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.6.5-0
	I1201 19:17:48.833940   42501 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" and sha "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be"
	I1201 19:17:48.834000   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 19:17:48.850210   42501 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.6.5-0" and sha "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42"
	I1201 19:17:48.850268   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.6.5-0
	I1201 19:17:48.854108   42501 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" and sha "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b"
	I1201 19:17:48.854164   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 19:17:48.860713   42501 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" does not exist at hash "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be" in container runtime
	I1201 19:17:48.860746   42501 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 19:17:48.860791   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:48.868342   42501 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-beta.0" and sha "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904"
	I1201 19:17:48.868405   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 19:17:48.872551   42501 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1201 19:17:48.872612   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1201 19:17:48.880583   42501 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1201 19:17:48.880639   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1201 19:17:48.887422   42501 cache_images.go:118] "registry.k8s.io/etcd:3.6.5-0" needs transfer: "registry.k8s.io/etcd:3.6.5-0" does not exist at hash "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42" in container runtime
	I1201 19:17:48.887452   42501 cri.go:218] Removing image: registry.k8s.io/etcd:3.6.5-0
	I1201 19:17:48.887499   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:48.903727   42501 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" does not exist at hash "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b" in container runtime
	I1201 19:17:48.903759   42501 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 19:17:48.903805   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:48.903878   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 19:17:48.914214   42501 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-beta.0" does not exist at hash "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904" in container runtime
	I1201 19:17:48.914247   42501 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 19:17:48.914293   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:48.928976   42501 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1201 19:17:48.929009   42501 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1201 19:17:48.929053   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:48.932901   42501 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1201 19:17:48.932934   42501 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1201 19:17:48.932979   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:48.933044   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1201 19:17:48.957631   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 19:17:48.957696   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 19:17:48.957751   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1201 19:17:48.957812   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 19:17:48.978974   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1201 19:17:48.979048   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1201 19:17:49.007501   42501 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" and sha "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4"
	I1201 19:17:49.007559   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 19:17:49.057111   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 19:17:49.057190   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 19:17:49.057257   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 19:17:49.057338   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1201 19:17:49.076608   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1201 19:17:49.076670   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1201 19:17:49.084671   42501 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" does not exist at hash "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4" in container runtime
	I1201 19:17:49.084705   42501 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 19:17:49.084750   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:49.169242   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1201 19:17:49.169325   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0
	I1201 19:17:49.169398   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1201 19:17:49.169471   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 19:17:49.169558   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 19:17:49.169630   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1201 19:17:49.169684   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0
	I1201 19:17:49.169730   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0
	I1201 19:17:49.169794   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 19:17:49.237299   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0
	I1201 19:17:49.237388   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1201 19:17:49.237458   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1201 19:17:49.237518   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1201 19:17:49.237574   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0': No such file or directory
	I1201 19:17:49.237585   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0 (20672000 bytes)
	I1201 19:17:49.278815   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1201 19:17:49.278904   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1
	I1201 19:17:49.278968   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0
	I1201 19:17:49.279009   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1201 19:17:49.287243   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/etcd_3.6.5-0: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/etcd_3.6.5-0': No such file or directory
	I1201 19:17:49.287290   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 --> /var/lib/minikube/images/etcd_3.6.5-0 (21148160 bytes)
	I1201 19:17:49.287368   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 19:17:49.287424   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0': No such file or directory
	I1201 19:17:49.287434   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0 (15401984 bytes)
	I1201 19:17:49.287473   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1201 19:17:49.287480   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1201 19:17:49.325605   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/coredns_v1.13.1: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/coredns_v1.13.1': No such file or directory
	I1201 19:17:49.325630   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 --> /var/lib/minikube/images/coredns_v1.13.1 (21178368 bytes)
	I1201 19:17:49.325673   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-proxy_v1.35.0-beta.0': No such file or directory
	I1201 19:17:49.325682   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0 (22432256 bytes)
	I1201 19:17:49.370472   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 19:17:49.371338   42501 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1201 19:17:49.371391   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	W1201 19:17:49.686756   42501 image.go:286] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1201 19:17:49.686883   42501 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1201 19:17:49.686936   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:17:49.715256   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0
	I1201 19:17:49.715673   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1201 19:17:49.802690   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 from cache
	I1201 19:17:49.802731   42501 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1201 19:17:49.802759   42501 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:17:49.802807   42501 ssh_runner.go:195] Run: which crictl
	I1201 19:17:49.802867   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0': No such file or directory
	I1201 19:17:49.802891   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0 (24689152 bytes)
	I1201 19:17:49.806289   42501 containerd.go:285] Loading image: /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1201 19:17:49.806357   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1201 19:17:49.868317   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:17:51.074640   42501 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: (1.268258674s)
	I1201 19:17:51.074659   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 from cache
	I1201 19:17:51.074659   42501 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5: (1.206323311s)
	I1201 19:17:51.074684   42501 containerd.go:285] Loading image: /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1201 19:17:51.074720   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:17:51.074725   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1201 19:17:52.050134   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 from cache
	I1201 19:17:52.050160   42501 containerd.go:285] Loading image: /var/lib/minikube/images/etcd_3.6.5-0
	I1201 19:17:52.050218   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0
	I1201 19:17:52.050227   42501 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:17:53.406997   42501 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0: (1.356756377s)
	I1201 19:17:53.407012   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 from cache
	I1201 19:17:53.407027   42501 containerd.go:285] Loading image: /var/lib/minikube/images/coredns_v1.13.1
	I1201 19:17:53.407073   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1
	I1201 19:17:53.407129   42501 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5: (1.356891512s)
	I1201 19:17:53.407151   42501 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1201 19:17:53.407221   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1201 19:17:54.359757   42501 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1201 19:17:54.359759   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 from cache
	I1201 19:17:54.359780   42501 containerd.go:285] Loading image: /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1201 19:17:54.359781   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1201 19:17:54.359822   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1201 19:17:55.332537   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 from cache
	I1201 19:17:55.332562   42501 containerd.go:285] Loading image: /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1201 19:17:55.332610   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1201 19:17:56.363584   42501 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: (1.030952436s)
	I1201 19:17:56.363601   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 from cache
	I1201 19:17:56.363627   42501 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1201 19:17:56.363677   42501 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1201 19:17:56.703938   42501 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1201 19:17:56.703978   42501 cache_images.go:125] Successfully loaded all cached images
	I1201 19:17:56.703982   42501 cache_images.go:94] duration metric: took 8.270291268s to LoadCachedImages
	I1201 19:17:56.703994   42501 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:17:56.704095   42501 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:17:56.704164   42501 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:17:56.728810   42501 cni.go:84] Creating CNI manager for ""
	I1201 19:17:56.728830   42501 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:17:56.728853   42501 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:17:56.728875   42501 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:17:56.728999   42501 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 19:17:56.729073   42501 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:17:56.737296   42501 binaries.go:54] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.35.0-beta.0': No such file or directory
	
	Initiating transfer...
	I1201 19:17:56.737363   42501 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:17:56.745368   42501 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl.sha256
	I1201 19:17:56.745392   42501 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet.sha256
	I1201 19:17:56.745433   42501 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:17:56.745471   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl
	I1201 19:17:56.745374   42501 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1201 19:17:56.745549   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm
	I1201 19:17:56.761793   42501 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubectl': No such file or directory
	I1201 19:17:56.761814   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubectl --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl (55181496 bytes)
	I1201 19:17:56.761822   42501 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet
	I1201 19:17:56.761861   42501 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm': No such file or directory
	I1201 19:17:56.761869   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubeadm --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm (68354232 bytes)
	I1201 19:17:56.795807   42501 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet': No such file or directory
	I1201 19:17:56.795833   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubelet --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet (54329636 bytes)
	I1201 19:17:57.551549   42501 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:17:57.564111   42501 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:17:57.578870   42501 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:17:57.592926   42501 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1201 19:17:57.606647   42501 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:17:57.610266   42501 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1201 19:17:57.620028   42501 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:17:57.736624   42501 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:17:57.754907   42501 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:17:57.754917   42501 certs.go:195] generating shared ca certs ...
	I1201 19:17:57.754933   42501 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:57.755064   42501 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:17:57.755105   42501 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:17:57.755116   42501 certs.go:257] generating profile certs ...
	I1201 19:17:57.755180   42501 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:17:57.755193   42501 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt with IP's: []
	I1201 19:17:58.285091   42501 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt ...
	I1201 19:17:58.285107   42501 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: {Name:mk4125e51fcde1a12dfbc3371f2c3c2d9ace2a92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:58.285301   42501 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key ...
	I1201 19:17:58.285307   42501 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key: {Name:mk28fb801c4f5bedba2a0b36654671ad426b602a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:58.285403   42501 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:17:58.285414   42501 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt.910e2deb with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1201 19:17:58.393182   42501 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt.910e2deb ...
	I1201 19:17:58.393195   42501 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt.910e2deb: {Name:mkc2fd9dae83de1ddd966cc88540933bcaf4bb23 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:58.393357   42501 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb ...
	I1201 19:17:58.393364   42501 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb: {Name:mk366f7903020191b512ede05c723f216665e18b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:58.393444   42501 certs.go:382] copying /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt.910e2deb -> /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt
	I1201 19:17:58.393542   42501 certs.go:386] copying /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb -> /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key
	I1201 19:17:58.393596   42501 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:17:58.393607   42501 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt with IP's: []
	I1201 19:17:58.621478   42501 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt ...
	I1201 19:17:58.621499   42501 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt: {Name:mkbb5626c5497aed7c2a3f9dba2c0d2539c2b74c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:58.621662   42501 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key ...
	I1201 19:17:58.621669   42501 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key: {Name:mked6cada6737b61749efaada82ba42b0f1d4726 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:17:58.621846   42501 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:17:58.621886   42501 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:17:58.621899   42501 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:17:58.621924   42501 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:17:58.621946   42501 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:17:58.621968   42501 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:17:58.622009   42501 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:17:58.622570   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:17:58.639969   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:17:58.656837   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:17:58.675924   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:17:58.694090   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:17:58.712147   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:17:58.730816   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:17:58.748668   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:17:58.765611   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:17:58.782440   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:17:58.799514   42501 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:17:58.816027   42501 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:17:58.828524   42501 ssh_runner.go:195] Run: openssl version
	I1201 19:17:58.834632   42501 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:17:58.842479   42501 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:17:58.845849   42501 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:17:58.845899   42501 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:17:58.886457   42501 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:17:58.894894   42501 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:17:58.903138   42501 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:17:58.907005   42501 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:17:58.907059   42501 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:17:58.948042   42501 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:17:58.956178   42501 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:17:58.963941   42501 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:17:58.967549   42501 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:17:58.967602   42501 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:17:59.008445   42501 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:17:59.016620   42501 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:17:59.020182   42501 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1201 19:17:59.020225   42501 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false
CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:17:59.020297   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:17:59.020353   42501 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:17:59.056021   42501 cri.go:89] found id: ""
	I1201 19:17:59.056087   42501 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:17:59.063935   42501 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:17:59.071607   42501 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:17:59.071659   42501 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:17:59.079216   42501 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:17:59.079227   42501 kubeadm.go:158] found existing configuration files:
	
	I1201 19:17:59.079274   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:17:59.086684   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:17:59.086740   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:17:59.093801   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:17:59.101118   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:17:59.101183   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:17:59.108432   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:17:59.116024   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:17:59.116079   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:17:59.123418   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:17:59.131115   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:17:59.131172   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:17:59.138708   42501 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:17:59.179598   42501 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:17:59.179712   42501 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:17:59.254274   42501 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:17:59.254339   42501 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:17:59.254373   42501 kubeadm.go:319] OS: Linux
	I1201 19:17:59.254420   42501 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:17:59.254467   42501 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:17:59.254512   42501 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:17:59.254559   42501 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:17:59.254608   42501 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:17:59.254654   42501 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:17:59.254697   42501 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:17:59.254744   42501 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:17:59.254789   42501 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:17:59.325799   42501 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:17:59.325925   42501 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:17:59.326024   42501 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:17:59.334094   42501 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:17:59.342677   42501 out.go:252]   - Generating certificates and keys ...
	I1201 19:17:59.342772   42501 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:17:59.342852   42501 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:17:59.720671   42501 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1201 19:18:00.200889   42501 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1201 19:18:00.260186   42501 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1201 19:18:00.623836   42501 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1201 19:18:00.979513   42501 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1201 19:18:00.979855   42501 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [functional-428744 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1201 19:18:01.069998   42501 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1201 19:18:01.070151   42501 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [functional-428744 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1201 19:18:01.302743   42501 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1201 19:18:01.772557   42501 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1201 19:18:01.982514   42501 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1201 19:18:01.982802   42501 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:18:02.042458   42501 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:18:02.126604   42501 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:18:02.415467   42501 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:18:02.808797   42501 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:18:02.922424   42501 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:18:02.923017   42501 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:18:02.925909   42501 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:18:02.934650   42501 out.go:252]   - Booting up control plane ...
	I1201 19:18:02.934764   42501 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:18:02.934850   42501 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:18:02.934935   42501 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:18:02.953544   42501 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:18:02.953650   42501 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:18:02.962916   42501 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:18:02.964359   42501 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:18:02.964424   42501 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:18:03.124615   42501 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:18:03.124749   42501 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:22:03.124338   42501 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.00013097s
	I1201 19:22:03.124361   42501 kubeadm.go:319] 
	I1201 19:22:03.124414   42501 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:22:03.124455   42501 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:22:03.124600   42501 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:22:03.124617   42501 kubeadm.go:319] 
	I1201 19:22:03.124738   42501 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:22:03.124770   42501 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:22:03.124798   42501 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:22:03.124801   42501 kubeadm.go:319] 
	I1201 19:22:03.128350   42501 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:22:03.128765   42501 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:22:03.128873   42501 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:22:03.129135   42501 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1201 19:22:03.129138   42501 kubeadm.go:319] 
	I1201 19:22:03.129206   42501 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1201 19:22:03.129323   42501 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-428744 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-428744 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.00013097s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1201 19:22:03.129413   42501 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:22:03.544717   42501 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:22:03.558973   42501 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:22:03.559039   42501 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:22:03.569188   42501 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:22:03.569199   42501 kubeadm.go:158] found existing configuration files:
	
	I1201 19:22:03.569254   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:22:03.577468   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:22:03.577612   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:22:03.585426   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:22:03.594178   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:22:03.594236   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:22:03.602310   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:22:03.610648   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:22:03.610704   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:22:03.618590   42501 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:22:03.626439   42501 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:22:03.626499   42501 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:22:03.634445   42501 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:22:03.750218   42501 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:22:03.750657   42501 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:22:03.817000   42501 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:26:04.793012   42501 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1201 19:26:04.793034   42501 kubeadm.go:319] 
	I1201 19:26:04.793150   42501 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1201 19:26:04.797914   42501 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:26:04.797973   42501 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:26:04.798097   42501 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:26:04.798165   42501 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:26:04.798203   42501 kubeadm.go:319] OS: Linux
	I1201 19:26:04.798255   42501 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:26:04.798311   42501 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:26:04.798369   42501 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:26:04.798413   42501 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:26:04.798470   42501 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:26:04.798519   42501 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:26:04.798563   42501 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:26:04.798616   42501 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:26:04.798665   42501 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:26:04.798756   42501 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:26:04.798885   42501 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:26:04.798969   42501 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:26:04.799030   42501 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:26:04.802109   42501 out.go:252]   - Generating certificates and keys ...
	I1201 19:26:04.802200   42501 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:26:04.802287   42501 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:26:04.802373   42501 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:26:04.802439   42501 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:26:04.802505   42501 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:26:04.802564   42501 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:26:04.802626   42501 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:26:04.802702   42501 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:26:04.802777   42501 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:26:04.802857   42501 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:26:04.802895   42501 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:26:04.802949   42501 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:26:04.803000   42501 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:26:04.803055   42501 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:26:04.803137   42501 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:26:04.803207   42501 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:26:04.803263   42501 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:26:04.803342   42501 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:26:04.803407   42501 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:26:04.808203   42501 out.go:252]   - Booting up control plane ...
	I1201 19:26:04.808294   42501 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:26:04.808371   42501 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:26:04.808433   42501 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:26:04.808530   42501 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:26:04.808619   42501 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:26:04.808716   42501 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:26:04.808794   42501 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:26:04.808830   42501 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:26:04.808951   42501 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:26:04.809049   42501 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:26:04.809108   42501 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000176325s
	I1201 19:26:04.809111   42501 kubeadm.go:319] 
	I1201 19:26:04.809163   42501 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:26:04.809192   42501 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:26:04.809289   42501 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:26:04.809292   42501 kubeadm.go:319] 
	I1201 19:26:04.809389   42501 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:26:04.809418   42501 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:26:04.809446   42501 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:26:04.809561   42501 kubeadm.go:403] duration metric: took 8m5.789323556s to StartCluster
	I1201 19:26:04.809592   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:26:04.809666   42501 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:26:04.809690   42501 kubeadm.go:319] 
	I1201 19:26:04.834685   42501 cri.go:89] found id: ""
	I1201 19:26:04.834700   42501 logs.go:282] 0 containers: []
	W1201 19:26:04.834708   42501 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:26:04.834714   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:26:04.834773   42501 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:26:04.858223   42501 cri.go:89] found id: ""
	I1201 19:26:04.858237   42501 logs.go:282] 0 containers: []
	W1201 19:26:04.858244   42501 logs.go:284] No container was found matching "etcd"
	I1201 19:26:04.858249   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:26:04.858308   42501 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:26:04.883735   42501 cri.go:89] found id: ""
	I1201 19:26:04.883749   42501 logs.go:282] 0 containers: []
	W1201 19:26:04.883756   42501 logs.go:284] No container was found matching "coredns"
	I1201 19:26:04.883762   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:26:04.883836   42501 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:26:04.909868   42501 cri.go:89] found id: ""
	I1201 19:26:04.909882   42501 logs.go:282] 0 containers: []
	W1201 19:26:04.909889   42501 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:26:04.909894   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:26:04.909951   42501 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:26:04.934786   42501 cri.go:89] found id: ""
	I1201 19:26:04.934799   42501 logs.go:282] 0 containers: []
	W1201 19:26:04.934807   42501 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:26:04.934812   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:26:04.934869   42501 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:26:04.959097   42501 cri.go:89] found id: ""
	I1201 19:26:04.959111   42501 logs.go:282] 0 containers: []
	W1201 19:26:04.959117   42501 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:26:04.959123   42501 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:26:04.959179   42501 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:26:04.986333   42501 cri.go:89] found id: ""
	I1201 19:26:04.986347   42501 logs.go:282] 0 containers: []
	W1201 19:26:04.986364   42501 logs.go:284] No container was found matching "kindnet"
	I1201 19:26:04.986374   42501 logs.go:123] Gathering logs for kubelet ...
	I1201 19:26:04.986384   42501 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:26:05.041981   42501 logs.go:123] Gathering logs for dmesg ...
	I1201 19:26:05.041999   42501 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:26:05.059956   42501 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:26:05.059974   42501 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:26:05.151818   42501 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:26:05.139324    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.140031    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.142218    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.143039    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.144888    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:26:05.139324    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.140031    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.142218    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.143039    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:05.144888    5379 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:26:05.151829   42501 logs.go:123] Gathering logs for containerd ...
	I1201 19:26:05.151839   42501 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:26:05.194397   42501 logs.go:123] Gathering logs for container status ...
	I1201 19:26:05.194416   42501 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1201 19:26:05.223142   42501 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000176325s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	W1201 19:26:05.223185   42501 out.go:285] * 
	W1201 19:26:05.223291   42501 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000176325s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:26:05.223351   42501 out.go:285] * 
	W1201 19:26:05.225623   42501 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:26:05.231663   42501 out.go:203] 
	W1201 19:26:05.234600   42501 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000176325s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:26:05.234641   42501 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1201 19:26:05.234661   42501 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1201 19:26:05.237767   42501 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:17:51 functional-428744 containerd[765]: time="2025-12-01T19:17:51.083498806Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-scheduler:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:52 functional-428744 containerd[765]: time="2025-12-01T19:17:52.036438400Z" level=info msg="No images store for sha256:5ed8f231f07481c657ad0e1d039921948e7abbc30ef6215465129012c4c4a508"
	Dec 01 19:17:52 functional-428744 containerd[765]: time="2025-12-01T19:17:52.038861258Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.0-beta.0\""
	Dec 01 19:17:52 functional-428744 containerd[765]: time="2025-12-01T19:17:52.051421520Z" level=info msg="ImageCreate event name:\"sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:52 functional-428744 containerd[765]: time="2025-12-01T19:17:52.052327287Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-controller-manager:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:53 functional-428744 containerd[765]: time="2025-12-01T19:17:53.398705326Z" level=info msg="No images store for sha256:89a52ae86f116708cd5ba0d54dfbf2ae3011f126ee9161c4afb19bf2a51ef285"
	Dec 01 19:17:53 functional-428744 containerd[765]: time="2025-12-01T19:17:53.400844164Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\""
	Dec 01 19:17:53 functional-428744 containerd[765]: time="2025-12-01T19:17:53.413858019Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:53 functional-428744 containerd[765]: time="2025-12-01T19:17:53.414719558Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/etcd:3.6.5-0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:54 functional-428744 containerd[765]: time="2025-12-01T19:17:54.349639760Z" level=info msg="No images store for sha256:5e4a4fe83792bf529a4e283e09069cf50cc9882d04168a33903ed6809a492e61"
	Dec 01 19:17:54 functional-428744 containerd[765]: time="2025-12-01T19:17:54.351965459Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\""
	Dec 01 19:17:54 functional-428744 containerd[765]: time="2025-12-01T19:17:54.360122306Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:54 functional-428744 containerd[765]: time="2025-12-01T19:17:54.360891084Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:55 functional-428744 containerd[765]: time="2025-12-01T19:17:55.324260194Z" level=info msg="No images store for sha256:eb9020767c0d3bbd754f3f52cbe4c8bdd935dd5862604d6dc0b1f10422189544"
	Dec 01 19:17:55 functional-428744 containerd[765]: time="2025-12-01T19:17:55.327152892Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.0-beta.0\""
	Dec 01 19:17:55 functional-428744 containerd[765]: time="2025-12-01T19:17:55.337284683Z" level=info msg="ImageCreate event name:\"sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:55 functional-428744 containerd[765]: time="2025-12-01T19:17:55.337700335Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-proxy:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.355234506Z" level=info msg="No images store for sha256:64f3fb0a3392f487dbd4300c920f76dc3de2961e11fd6bfbedc75c0d25b1954c"
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.357772527Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.0-beta.0\""
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.367446729Z" level=info msg="ImageCreate event name:\"sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.368088143Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-apiserver:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.693514702Z" level=info msg="No images store for sha256:7475c7d18769df89a804d5bebf679dbf94886f3626f07a2be923beaa0cc7e5b0"
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.695828232Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/storage-provisioner:v5\""
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.704080106Z" level=info msg="ImageCreate event name:\"sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:17:56 functional-428744 containerd[765]: time="2025-12-01T19:17:56.704407443Z" level=info msg="ImageUpdate event name:\"gcr.io/k8s-minikube/storage-provisioner:v5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:26:06.193914    5504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:06.194461    5504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:06.196143    5504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:06.196684    5504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:26:06.198234    5504 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:26:06 up  1:08,  0 user,  load average: 0.17, 0.50, 0.78
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:26:02 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:26:03 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 318.
	Dec 01 19:26:03 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:03 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:03 functional-428744 kubelet[5305]: E1201 19:26:03.634308    5305 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:26:03 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:26:03 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:26:04 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 01 19:26:04 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:04 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:04 functional-428744 kubelet[5311]: E1201 19:26:04.382263    5311 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:26:04 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:26:04 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:26:05 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 01 19:26:05 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:05 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:05 functional-428744 kubelet[5383]: E1201 19:26:05.155844    5383 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:26:05 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:26:05 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:26:05 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 01 19:26:05 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:05 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:26:05 functional-428744 kubelet[5425]: E1201 19:26:05.886513    5425 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:26:05 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:26:05 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 6 (353.883418ms)

                                                
                                                
-- stdout --
	Stopped
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E1201 19:26:06.693731   48727 status.go:458] kubeconfig endpoint: get endpoint: "functional-428744" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:262: status error: exit status 6 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy (505.96s)
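Editor's note on the failure mode above: the kubelet crash loop ("kubelet is configured to not run on a host using cgroup v1", restart counters 318-321) matches the kubeadm preflight warning earlier in the log, which says that for kubelet v1.35 or newer, running on a cgroup v1 host requires explicitly setting `FailCgroupV1` to `false` and skipping the corresponding validation. As a minimal sketch only, here is that opt-in expressed as a KubeletConfiguration fragment; the lowercase field spelling is assumed from the v1beta1 config convention and has not been verified against this minikube build, which normally generates this file itself:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Opt in to running kubelet on a cgroup v1 host. Per the kubeadm warning in
# the log above, the SystemVerification preflight check must also be skipped
# explicitly for this to take effect.
failCgroupV1: false
```

Whether a node is actually on cgroup v1 can be confirmed with `stat -fc %T /sys/fs/cgroup` (`tmpfs` indicates the legacy v1 hierarchy, `cgroup2fs` the unified v2 one). The Ubuntu 20.04 / kernel 5.15 Jenkins host in this run appears to boot with cgroup v1, which is consistent with every one of the 321 kubelet restarts failing with the identical validation error; the suggested `--extra-config=kubelet.cgroup-driver=systemd` from minikube would not address that root cause.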

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart (368.39s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart
I1201 19:26:06.709765    4305 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-428744 --alsologtostderr -v=8
E1201 19:26:46.971957    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:27:14.682706    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:29:27.087476    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:30:50.165870    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:31:46.971503    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-428744 --alsologtostderr -v=8: exit status 80 (6m5.400438102s)

                                                
                                                
-- stdout --
	* [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: 
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I1201 19:26:06.760311   48804 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:26:06.760471   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760480   48804 out.go:374] Setting ErrFile to fd 2...
	I1201 19:26:06.760485   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760749   48804 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:26:06.761114   48804 out.go:368] Setting JSON to false
	I1201 19:26:06.761974   48804 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4118,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:26:06.762048   48804 start.go:143] virtualization:  
	I1201 19:26:06.765446   48804 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:26:06.769259   48804 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:26:06.769379   48804 notify.go:221] Checking for updates...
	I1201 19:26:06.775400   48804 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:26:06.778339   48804 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:06.781100   48804 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:26:06.784047   48804 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:26:06.786945   48804 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:26:06.790355   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:06.790504   48804 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:26:06.817889   48804 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:26:06.818002   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.874928   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.865437959 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.875040   48804 docker.go:319] overlay module found
	I1201 19:26:06.878298   48804 out.go:179] * Using the docker driver based on existing profile
	I1201 19:26:06.881322   48804 start.go:309] selected driver: docker
	I1201 19:26:06.881345   48804 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.881455   48804 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:26:06.881703   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.946129   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.93658681 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.946541   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:06.946612   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:06.946692   48804 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.949952   48804 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:26:06.952666   48804 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:26:06.955511   48804 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:26:06.958482   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:06.958560   48804 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:26:06.978189   48804 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:26:06.978215   48804 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:26:07.013576   48804 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:26:07.245550   48804 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:26:07.245729   48804 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:26:07.245814   48804 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245902   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:26:07.245911   48804 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 111.155µs
	I1201 19:26:07.245925   48804 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:26:07.245935   48804 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245965   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:26:07.245971   48804 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 37.068µs
	I1201 19:26:07.245977   48804 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:26:07.245979   48804 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:26:07.245986   48804 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246018   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:26:07.246022   48804 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 37.371µs
	I1201 19:26:07.246020   48804 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246029   48804 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246041   48804 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246063   48804 start.go:364] duration metric: took 29.397µs to acquireMachinesLock for "functional-428744"
	I1201 19:26:07.246076   48804 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:26:07.246081   48804 fix.go:54] fixHost starting: 
	I1201 19:26:07.246083   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:26:07.246089   48804 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 51.212µs
	I1201 19:26:07.246094   48804 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246103   48804 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246129   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:26:07.246135   48804 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 32.744µs
	I1201 19:26:07.246145   48804 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246154   48804 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246179   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:26:07.246184   48804 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.013µs
	I1201 19:26:07.246189   48804 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:26:07.246197   48804 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246221   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:26:07.246225   48804 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 29.356µs
	I1201 19:26:07.246230   48804 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:26:07.246238   48804 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246268   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:26:07.246273   48804 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 35.526µs
	I1201 19:26:07.246278   48804 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:26:07.246288   48804 cache.go:87] Successfully saved all images to host disk.
	I1201 19:26:07.246352   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:07.263626   48804 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:26:07.263658   48804 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:26:07.267042   48804 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:26:07.267094   48804 machine.go:94] provisionDockerMachine start ...
	I1201 19:26:07.267191   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.284298   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.284633   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.284647   48804 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:26:07.445599   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.445668   48804 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:26:07.445742   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.466448   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.466762   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.466780   48804 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:26:07.626795   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.626872   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.646204   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.646540   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.646566   48804 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:26:07.797736   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:26:07.797765   48804 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:26:07.797791   48804 ubuntu.go:190] setting up certificates
	I1201 19:26:07.797801   48804 provision.go:84] configureAuth start
	I1201 19:26:07.797871   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:07.815670   48804 provision.go:143] copyHostCerts
	I1201 19:26:07.815726   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815768   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:26:07.815790   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815876   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:26:07.815970   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.815990   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:26:07.815998   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.816026   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:26:07.816080   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816100   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:26:07.816107   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816131   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:26:07.816190   48804 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:26:07.904001   48804 provision.go:177] copyRemoteCerts
	I1201 19:26:07.904069   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:26:07.904109   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.922469   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.029518   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1201 19:26:08.029579   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:26:08.047419   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1201 19:26:08.047495   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:26:08.069296   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1201 19:26:08.069377   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:26:08.088982   48804 provision.go:87] duration metric: took 291.155414ms to configureAuth
	I1201 19:26:08.089064   48804 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:26:08.089321   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:08.089350   48804 machine.go:97] duration metric: took 822.24428ms to provisionDockerMachine
	I1201 19:26:08.089385   48804 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:26:08.089416   48804 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:26:08.089542   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:26:08.089633   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.112132   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.217325   48804 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:26:08.220778   48804 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1201 19:26:08.220802   48804 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1201 19:26:08.220808   48804 command_runner.go:130] > VERSION_ID="12"
	I1201 19:26:08.220813   48804 command_runner.go:130] > VERSION="12 (bookworm)"
	I1201 19:26:08.220817   48804 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1201 19:26:08.220820   48804 command_runner.go:130] > ID=debian
	I1201 19:26:08.220825   48804 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1201 19:26:08.220831   48804 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1201 19:26:08.220837   48804 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1201 19:26:08.220885   48804 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:26:08.220907   48804 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:26:08.220919   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:26:08.220978   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:26:08.221055   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:26:08.221066   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /etc/ssl/certs/43052.pem
	I1201 19:26:08.221140   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:26:08.221148   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> /etc/test/nested/copy/4305/hosts
	I1201 19:26:08.221198   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:26:08.229002   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:08.246695   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:26:08.263789   48804 start.go:296] duration metric: took 174.371826ms for postStartSetup
	I1201 19:26:08.263869   48804 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:26:08.263931   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.281235   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.382466   48804 command_runner.go:130] > 12%
	I1201 19:26:08.382557   48804 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:26:08.386763   48804 command_runner.go:130] > 172G
	I1201 19:26:08.387182   48804 fix.go:56] duration metric: took 1.141096136s for fixHost
	I1201 19:26:08.387210   48804 start.go:83] releasing machines lock for "functional-428744", held for 1.141138241s
	I1201 19:26:08.387280   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:08.405649   48804 ssh_runner.go:195] Run: cat /version.json
	I1201 19:26:08.405673   48804 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:26:08.405720   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.405736   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.424898   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.435929   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.615638   48804 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1201 19:26:08.615700   48804 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1201 19:26:08.615817   48804 ssh_runner.go:195] Run: systemctl --version
	I1201 19:26:08.621830   48804 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1201 19:26:08.621881   48804 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1201 19:26:08.622279   48804 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1201 19:26:08.626405   48804 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1201 19:26:08.626689   48804 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:26:08.626779   48804 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:26:08.634801   48804 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:26:08.634864   48804 start.go:496] detecting cgroup driver to use...
	I1201 19:26:08.634909   48804 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:26:08.634995   48804 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:26:08.650643   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:26:08.663900   48804 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:26:08.663962   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:26:08.680016   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:26:08.693295   48804 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:26:08.807192   48804 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:26:08.949829   48804 docker.go:234] disabling docker service ...
	I1201 19:26:08.949910   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:26:08.965005   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:26:08.978389   48804 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:26:09.113220   48804 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:26:09.265765   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:26:09.280775   48804 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:26:09.295503   48804 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1201 19:26:09.296833   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:26:09.307263   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:26:09.316009   48804 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:26:09.316129   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:26:09.324849   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.333586   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:26:09.341989   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.350174   48804 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:26:09.358089   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:26:09.366694   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:26:09.375459   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:26:09.384162   48804 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:26:09.390646   48804 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1201 19:26:09.391441   48804 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:26:09.398673   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:09.519779   48804 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:26:09.650665   48804 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:26:09.650790   48804 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:26:09.655039   48804 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1201 19:26:09.655139   48804 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1201 19:26:09.655166   48804 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1201 19:26:09.655199   48804 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:09.655222   48804 command_runner.go:130] > Access: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655243   48804 command_runner.go:130] > Modify: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655266   48804 command_runner.go:130] > Change: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655294   48804 command_runner.go:130] >  Birth: -
	I1201 19:26:09.655330   48804 start.go:564] Will wait 60s for crictl version
	I1201 19:26:09.655409   48804 ssh_runner.go:195] Run: which crictl
	I1201 19:26:09.659043   48804 command_runner.go:130] > /usr/local/bin/crictl
	I1201 19:26:09.659221   48804 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:26:09.684907   48804 command_runner.go:130] > Version:  0.1.0
	I1201 19:26:09.684979   48804 command_runner.go:130] > RuntimeName:  containerd
	I1201 19:26:09.684999   48804 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1201 19:26:09.685021   48804 command_runner.go:130] > RuntimeApiVersion:  v1
	I1201 19:26:09.687516   48804 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:26:09.687623   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.708580   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.710309   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.728879   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.737012   48804 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:26:09.739912   48804 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:26:09.756533   48804 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:26:09.760816   48804 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1201 19:26:09.760978   48804 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:26:09.761088   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:09.761147   48804 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:26:09.788491   48804 command_runner.go:130] > {
	I1201 19:26:09.788509   48804 command_runner.go:130] >   "images":  [
	I1201 19:26:09.788514   48804 command_runner.go:130] >     {
	I1201 19:26:09.788524   48804 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1201 19:26:09.788529   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788534   48804 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1201 19:26:09.788538   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788542   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788546   48804 command_runner.go:130] >       "size":  "8032639",
	I1201 19:26:09.788552   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788556   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788560   48804 command_runner.go:130] >     },
	I1201 19:26:09.788563   48804 command_runner.go:130] >     {
	I1201 19:26:09.788570   48804 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1201 19:26:09.788573   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788578   48804 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1201 19:26:09.788582   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788586   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788598   48804 command_runner.go:130] >       "size":  "21166088",
	I1201 19:26:09.788603   48804 command_runner.go:130] >       "username":  "nonroot",
	I1201 19:26:09.788611   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788615   48804 command_runner.go:130] >     },
	I1201 19:26:09.788617   48804 command_runner.go:130] >     {
	I1201 19:26:09.788624   48804 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1201 19:26:09.788628   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788633   48804 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1201 19:26:09.788636   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788639   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788643   48804 command_runner.go:130] >       "size":  "21134420",
	I1201 19:26:09.788647   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788651   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788654   48804 command_runner.go:130] >       },
	I1201 19:26:09.788658   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788662   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788665   48804 command_runner.go:130] >     },
	I1201 19:26:09.788668   48804 command_runner.go:130] >     {
	I1201 19:26:09.788675   48804 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1201 19:26:09.788678   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788685   48804 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1201 19:26:09.788689   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788692   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788697   48804 command_runner.go:130] >       "size":  "24676285",
	I1201 19:26:09.788700   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788704   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788707   48804 command_runner.go:130] >       },
	I1201 19:26:09.788711   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788715   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788718   48804 command_runner.go:130] >     },
	I1201 19:26:09.788721   48804 command_runner.go:130] >     {
	I1201 19:26:09.788728   48804 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1201 19:26:09.788732   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788739   48804 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1201 19:26:09.788743   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788750   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788755   48804 command_runner.go:130] >       "size":  "20658969",
	I1201 19:26:09.788759   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788762   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788765   48804 command_runner.go:130] >       },
	I1201 19:26:09.788769   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788773   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788776   48804 command_runner.go:130] >     },
	I1201 19:26:09.788779   48804 command_runner.go:130] >     {
	I1201 19:26:09.788786   48804 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1201 19:26:09.788790   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788795   48804 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1201 19:26:09.788799   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788803   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788807   48804 command_runner.go:130] >       "size":  "22428165",
	I1201 19:26:09.788814   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788818   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788822   48804 command_runner.go:130] >     },
	I1201 19:26:09.788825   48804 command_runner.go:130] >     {
	I1201 19:26:09.788832   48804 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1201 19:26:09.788835   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788841   48804 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1201 19:26:09.788844   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788855   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788860   48804 command_runner.go:130] >       "size":  "15389290",
	I1201 19:26:09.788863   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788867   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788870   48804 command_runner.go:130] >       },
	I1201 19:26:09.788874   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788878   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788881   48804 command_runner.go:130] >     },
	I1201 19:26:09.788883   48804 command_runner.go:130] >     {
	I1201 19:26:09.788890   48804 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1201 19:26:09.788897   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788902   48804 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1201 19:26:09.788905   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788908   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788912   48804 command_runner.go:130] >       "size":  "265458",
	I1201 19:26:09.788920   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788924   48804 command_runner.go:130] >         "value":  "65535"
	I1201 19:26:09.788927   48804 command_runner.go:130] >       },
	I1201 19:26:09.788931   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788934   48804 command_runner.go:130] >       "pinned":  true
	I1201 19:26:09.788937   48804 command_runner.go:130] >     }
	I1201 19:26:09.788940   48804 command_runner.go:130] >   ]
	I1201 19:26:09.788943   48804 command_runner.go:130] > }
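The `sudo crictl images --output json` dump above is what minikube inspects before concluding "all images are preloaded". A minimal sketch of that check, assuming a trimmed sample of the same JSON shape (the `missing_images` helper is hypothetical, not minikube's actual code):

```python
import json

# Trimmed sample mirroring the crictl JSON shape logged above.
crictl_output = json.dumps({
    "images": [
        {"id": "sha256:ccd6...", "repoTags": ["registry.k8s.io/kube-apiserver:v1.35.0-beta.0"],
         "repoDigests": [], "size": "24676285"},
        {"id": "sha256:d7b1...", "repoTags": ["registry.k8s.io/pause:3.10.1"],
         "repoDigests": [], "size": "265458", "pinned": True},
    ]
})

def missing_images(raw_json, required):
    """Return the required tags absent from a crictl image listing."""
    tags = {t for img in json.loads(raw_json)["images"]
            for t in img.get("repoTags", [])}
    return [r for r in required if r not in tags]

print(missing_images(crictl_output, [
    "registry.k8s.io/kube-apiserver:v1.35.0-beta.0",
    "registry.k8s.io/etcd:3.6.5-0",
]))
```

With every required tag present, the list comes back empty, which is the condition under which the log proceeds to "Images are preloaded, skipping loading".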
	I1201 19:26:09.791239   48804 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:26:09.791264   48804 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:26:09.791273   48804 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:26:09.791374   48804 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
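The kubelet `ExecStart` line in the systemd drop-in above is assembled from the node config. A rough sketch of that assembly, under the assumption that only the hostname and node IP vary per node (`kubelet_flags` is a hypothetical helper, not minikube's implementation):

```python
# Hypothetical helper: build the kubelet flag list seen in the ExecStart
# line above from the per-node values in the cluster config.
def kubelet_flags(hostname, node_ip):
    return [
        "--bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf",
        "--config=/var/lib/kubelet/config.yaml",
        f"--hostname-override={hostname}",
        "--kubeconfig=/etc/kubernetes/kubelet.conf",
        f"--node-ip={node_ip}",
    ]

print(" ".join(kubelet_flags("functional-428744", "192.168.49.2")))
```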
	I1201 19:26:09.791446   48804 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:26:09.822661   48804 command_runner.go:130] > {
	I1201 19:26:09.822679   48804 command_runner.go:130] >   "cniconfig": {
	I1201 19:26:09.822684   48804 command_runner.go:130] >     "Networks": [
	I1201 19:26:09.822688   48804 command_runner.go:130] >       {
	I1201 19:26:09.822694   48804 command_runner.go:130] >         "Config": {
	I1201 19:26:09.822699   48804 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1201 19:26:09.822704   48804 command_runner.go:130] >           "Name": "cni-loopback",
	I1201 19:26:09.822709   48804 command_runner.go:130] >           "Plugins": [
	I1201 19:26:09.822712   48804 command_runner.go:130] >             {
	I1201 19:26:09.822717   48804 command_runner.go:130] >               "Network": {
	I1201 19:26:09.822721   48804 command_runner.go:130] >                 "ipam": {},
	I1201 19:26:09.822726   48804 command_runner.go:130] >                 "type": "loopback"
	I1201 19:26:09.822730   48804 command_runner.go:130] >               },
	I1201 19:26:09.822735   48804 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1201 19:26:09.822738   48804 command_runner.go:130] >             }
	I1201 19:26:09.822741   48804 command_runner.go:130] >           ],
	I1201 19:26:09.822751   48804 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1201 19:26:09.822755   48804 command_runner.go:130] >         },
	I1201 19:26:09.822760   48804 command_runner.go:130] >         "IFName": "lo"
	I1201 19:26:09.822764   48804 command_runner.go:130] >       }
	I1201 19:26:09.822771   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822776   48804 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1201 19:26:09.822780   48804 command_runner.go:130] >     "PluginDirs": [
	I1201 19:26:09.822784   48804 command_runner.go:130] >       "/opt/cni/bin"
	I1201 19:26:09.822787   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822792   48804 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1201 19:26:09.822795   48804 command_runner.go:130] >     "Prefix": "eth"
	I1201 19:26:09.822798   48804 command_runner.go:130] >   },
	I1201 19:26:09.822801   48804 command_runner.go:130] >   "config": {
	I1201 19:26:09.822805   48804 command_runner.go:130] >     "cdiSpecDirs": [
	I1201 19:26:09.822809   48804 command_runner.go:130] >       "/etc/cdi",
	I1201 19:26:09.822813   48804 command_runner.go:130] >       "/var/run/cdi"
	I1201 19:26:09.822816   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822823   48804 command_runner.go:130] >     "cni": {
	I1201 19:26:09.822827   48804 command_runner.go:130] >       "binDir": "",
	I1201 19:26:09.822831   48804 command_runner.go:130] >       "binDirs": [
	I1201 19:26:09.822834   48804 command_runner.go:130] >         "/opt/cni/bin"
	I1201 19:26:09.822837   48804 command_runner.go:130] >       ],
	I1201 19:26:09.822842   48804 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1201 19:26:09.822846   48804 command_runner.go:130] >       "confTemplate": "",
	I1201 19:26:09.822849   48804 command_runner.go:130] >       "ipPref": "",
	I1201 19:26:09.822853   48804 command_runner.go:130] >       "maxConfNum": 1,
	I1201 19:26:09.822857   48804 command_runner.go:130] >       "setupSerially": false,
	I1201 19:26:09.822862   48804 command_runner.go:130] >       "useInternalLoopback": false
	I1201 19:26:09.822865   48804 command_runner.go:130] >     },
	I1201 19:26:09.822872   48804 command_runner.go:130] >     "containerd": {
	I1201 19:26:09.822876   48804 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1201 19:26:09.822881   48804 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1201 19:26:09.822886   48804 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1201 19:26:09.822892   48804 command_runner.go:130] >       "runtimes": {
	I1201 19:26:09.822896   48804 command_runner.go:130] >         "runc": {
	I1201 19:26:09.822901   48804 command_runner.go:130] >           "ContainerAnnotations": null,
	I1201 19:26:09.822905   48804 command_runner.go:130] >           "PodAnnotations": null,
	I1201 19:26:09.822914   48804 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1201 19:26:09.822919   48804 command_runner.go:130] >           "cgroupWritable": false,
	I1201 19:26:09.822923   48804 command_runner.go:130] >           "cniConfDir": "",
	I1201 19:26:09.822927   48804 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1201 19:26:09.822931   48804 command_runner.go:130] >           "io_type": "",
	I1201 19:26:09.822934   48804 command_runner.go:130] >           "options": {
	I1201 19:26:09.822939   48804 command_runner.go:130] >             "BinaryName": "",
	I1201 19:26:09.822943   48804 command_runner.go:130] >             "CriuImagePath": "",
	I1201 19:26:09.822947   48804 command_runner.go:130] >             "CriuWorkPath": "",
	I1201 19:26:09.822951   48804 command_runner.go:130] >             "IoGid": 0,
	I1201 19:26:09.822955   48804 command_runner.go:130] >             "IoUid": 0,
	I1201 19:26:09.822959   48804 command_runner.go:130] >             "NoNewKeyring": false,
	I1201 19:26:09.822963   48804 command_runner.go:130] >             "Root": "",
	I1201 19:26:09.822968   48804 command_runner.go:130] >             "ShimCgroup": "",
	I1201 19:26:09.822972   48804 command_runner.go:130] >             "SystemdCgroup": false
	I1201 19:26:09.822975   48804 command_runner.go:130] >           },
	I1201 19:26:09.822980   48804 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1201 19:26:09.822987   48804 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1201 19:26:09.822991   48804 command_runner.go:130] >           "runtimePath": "",
	I1201 19:26:09.822996   48804 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1201 19:26:09.823001   48804 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1201 19:26:09.823005   48804 command_runner.go:130] >           "snapshotter": ""
	I1201 19:26:09.823008   48804 command_runner.go:130] >         }
	I1201 19:26:09.823011   48804 command_runner.go:130] >       }
	I1201 19:26:09.823014   48804 command_runner.go:130] >     },
	I1201 19:26:09.823026   48804 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1201 19:26:09.823032   48804 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1201 19:26:09.823037   48804 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1201 19:26:09.823041   48804 command_runner.go:130] >     "disableApparmor": false,
	I1201 19:26:09.823045   48804 command_runner.go:130] >     "disableHugetlbController": true,
	I1201 19:26:09.823049   48804 command_runner.go:130] >     "disableProcMount": false,
	I1201 19:26:09.823054   48804 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1201 19:26:09.823058   48804 command_runner.go:130] >     "enableCDI": true,
	I1201 19:26:09.823068   48804 command_runner.go:130] >     "enableSelinux": false,
	I1201 19:26:09.823073   48804 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1201 19:26:09.823078   48804 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1201 19:26:09.823091   48804 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1201 19:26:09.823096   48804 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1201 19:26:09.823100   48804 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1201 19:26:09.823105   48804 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1201 19:26:09.823109   48804 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1201 19:26:09.823115   48804 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823119   48804 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1201 19:26:09.823125   48804 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823129   48804 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1201 19:26:09.823135   48804 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1201 19:26:09.823138   48804 command_runner.go:130] >   },
	I1201 19:26:09.823141   48804 command_runner.go:130] >   "features": {
	I1201 19:26:09.823145   48804 command_runner.go:130] >     "supplemental_groups_policy": true
	I1201 19:26:09.823148   48804 command_runner.go:130] >   },
	I1201 19:26:09.823152   48804 command_runner.go:130] >   "golang": "go1.24.9",
	I1201 19:26:09.823162   48804 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823173   48804 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823176   48804 command_runner.go:130] >   "runtimeHandlers": [
	I1201 19:26:09.823179   48804 command_runner.go:130] >     {
	I1201 19:26:09.823183   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823188   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823194   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823197   48804 command_runner.go:130] >       }
	I1201 19:26:09.823199   48804 command_runner.go:130] >     },
	I1201 19:26:09.823202   48804 command_runner.go:130] >     {
	I1201 19:26:09.823206   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823211   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823215   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823218   48804 command_runner.go:130] >       },
	I1201 19:26:09.823221   48804 command_runner.go:130] >       "name": "runc"
	I1201 19:26:09.823228   48804 command_runner.go:130] >     }
	I1201 19:26:09.823231   48804 command_runner.go:130] >   ],
	I1201 19:26:09.823235   48804 command_runner.go:130] >   "status": {
	I1201 19:26:09.823239   48804 command_runner.go:130] >     "conditions": [
	I1201 19:26:09.823242   48804 command_runner.go:130] >       {
	I1201 19:26:09.823245   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823249   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823252   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823257   48804 command_runner.go:130] >         "type": "RuntimeReady"
	I1201 19:26:09.823260   48804 command_runner.go:130] >       },
	I1201 19:26:09.823263   48804 command_runner.go:130] >       {
	I1201 19:26:09.823269   48804 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1201 19:26:09.823274   48804 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1201 19:26:09.823277   48804 command_runner.go:130] >         "status": false,
	I1201 19:26:09.823282   48804 command_runner.go:130] >         "type": "NetworkReady"
	I1201 19:26:09.823285   48804 command_runner.go:130] >       },
	I1201 19:26:09.823288   48804 command_runner.go:130] >       {
	I1201 19:26:09.823292   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823295   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823299   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823305   48804 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1201 19:26:09.823308   48804 command_runner.go:130] >       }
	I1201 19:26:09.823310   48804 command_runner.go:130] >     ]
	I1201 19:26:09.823313   48804 command_runner.go:130] >   }
	I1201 19:26:09.823316   48804 command_runner.go:130] > }
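The `status.conditions` block in the `crictl info` output above is the interesting part at this stage: `RuntimeReady` is true but `NetworkReady` is false with reason `NetworkPluginNotReady`, which is why the next log lines go on to set up a CNI. A short sketch that pulls out the failing conditions from that shape (condensed sample data; the helper name is illustrative):

```python
# Condensed from the `sudo crictl info` status block logged above.
info = {
    "status": {
        "conditions": [
            {"type": "RuntimeReady", "status": True, "reason": "", "message": ""},
            {"type": "NetworkReady", "status": False,
             "reason": "NetworkPluginNotReady",
             "message": "Network plugin returns error: cni plugin not initialized"},
        ]
    }
}

def failing_conditions(info):
    """Return (type, reason) for every runtime condition that is not healthy."""
    return [(c["type"], c["reason"])
            for c in info["status"]["conditions"] if not c["status"]]

print(failing_conditions(info))
```

Here the only failure is `('NetworkReady', 'NetworkPluginNotReady')`, consistent with the `lastCNILoadStatus` message about no network config in /etc/cni/net.d.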
	I1201 19:26:09.824829   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:09.824854   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:09.824874   48804 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:26:09.824897   48804 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:26:09.825029   48804 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
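The kubeadm config printed above is a four-document YAML stream, one document per component. A minimal sketch of splitting such a stream and listing each document's `kind`, using plain string handling so no YAML library is assumed (the sample string keeps only the `apiVersion`/`kind` lines of each document above):

```python
# Skeleton of the generated multi-document kubeadm config logged above,
# reduced to the apiVersion/kind headers of each document.
kubeadm_yaml = """\
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
"""

def document_kinds(multi_doc):
    """Split a multi-document YAML stream on '---' lines and collect kinds."""
    kinds = []
    for doc in multi_doc.split("\n---\n"):
        for line in doc.splitlines():
            if line.startswith("kind: "):
                kinds.append(line.split(": ", 1)[1])
    return kinds

print(document_kinds(kubeadm_yaml))
```

The four kinds match the stream that gets written to /var/tmp/minikube/kubeadm.yaml.new a few lines below (the 2237-byte scp).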
	I1201 19:26:09.825110   48804 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:26:09.833035   48804 command_runner.go:130] > kubeadm
	I1201 19:26:09.833056   48804 command_runner.go:130] > kubectl
	I1201 19:26:09.833061   48804 command_runner.go:130] > kubelet
	I1201 19:26:09.833076   48804 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:26:09.833134   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:26:09.840788   48804 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:26:09.853581   48804 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:26:09.866488   48804 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1201 19:26:09.879364   48804 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:26:09.883102   48804 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1201 19:26:09.883255   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:10.007542   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:10.337813   48804 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:26:10.337836   48804 certs.go:195] generating shared ca certs ...
	I1201 19:26:10.337853   48804 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:10.338014   48804 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:26:10.338073   48804 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:26:10.338085   48804 certs.go:257] generating profile certs ...
	I1201 19:26:10.338185   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:26:10.338247   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:26:10.338297   48804 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:26:10.338309   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1201 19:26:10.338322   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1201 19:26:10.338339   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1201 19:26:10.338351   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1201 19:26:10.338365   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1201 19:26:10.338377   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1201 19:26:10.338392   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1201 19:26:10.338406   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1201 19:26:10.338461   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:26:10.338495   48804 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:26:10.338507   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:26:10.338544   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:26:10.338574   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:26:10.338602   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:26:10.338653   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:10.338691   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.338709   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.338720   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem -> /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.339292   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:26:10.367504   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:26:10.391051   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:26:10.410924   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:26:10.429158   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:26:10.447137   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:26:10.464077   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:26:10.481473   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:26:10.498763   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:26:10.516542   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:26:10.534712   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:26:10.552802   48804 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:26:10.565633   48804 ssh_runner.go:195] Run: openssl version
	I1201 19:26:10.571657   48804 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1201 19:26:10.572092   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:26:10.580812   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584562   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584589   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584650   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.625269   48804 command_runner.go:130] > 3ec20f2e
	I1201 19:26:10.625746   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:26:10.633767   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:26:10.642160   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.645995   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646248   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646315   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.686937   48804 command_runner.go:130] > b5213941
	I1201 19:26:10.687439   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:26:10.695499   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:26:10.704517   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708133   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708431   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708519   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.749422   48804 command_runner.go:130] > 51391683
	I1201 19:26:10.749951   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:26:10.758524   48804 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762526   48804 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762565   48804 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1201 19:26:10.762572   48804 command_runner.go:130] > Device: 259,1	Inode: 1053621     Links: 1
	I1201 19:26:10.762579   48804 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:10.762585   48804 command_runner.go:130] > Access: 2025-12-01 19:22:03.818228473 +0000
	I1201 19:26:10.762590   48804 command_runner.go:130] > Modify: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762599   48804 command_runner.go:130] > Change: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762604   48804 command_runner.go:130] >  Birth: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762682   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:26:10.803623   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.804107   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:26:10.845983   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.846486   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:26:10.887221   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.887637   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:26:10.928253   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.928695   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:26:10.970677   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.971198   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:26:11.012420   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:11.012544   48804 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:11.012658   48804 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:26:11.012733   48804 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:26:11.044110   48804 cri.go:89] found id: ""
	I1201 19:26:11.044177   48804 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:26:11.054430   48804 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1201 19:26:11.054508   48804 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1201 19:26:11.054530   48804 command_runner.go:130] > /var/lib/minikube/etcd:
	I1201 19:26:11.054631   48804 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:26:11.054642   48804 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:26:11.054719   48804 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:26:11.063470   48804 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:26:11.063923   48804 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-428744" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.064051   48804 kubeconfig.go:62] /home/jenkins/minikube-integration/21997-2497/kubeconfig needs updating (will repair): [kubeconfig missing "functional-428744" cluster setting kubeconfig missing "functional-428744" context setting]
	I1201 19:26:11.064410   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.064918   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.065081   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.065855   48804 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1201 19:26:11.065877   48804 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1201 19:26:11.065883   48804 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1201 19:26:11.065889   48804 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1201 19:26:11.065893   48804 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1201 19:26:11.065945   48804 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1201 19:26:11.066161   48804 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:26:11.074525   48804 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1201 19:26:11.074603   48804 kubeadm.go:602] duration metric: took 19.955614ms to restartPrimaryControlPlane
	I1201 19:26:11.074623   48804 kubeadm.go:403] duration metric: took 62.08191ms to StartCluster
	I1201 19:26:11.074644   48804 settings.go:142] acquiring lock: {Name:mk0c68be267fd1e06eeb79721201896d000b433c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.074712   48804 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.075396   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.075623   48804 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 19:26:11.076036   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:11.076070   48804 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1201 19:26:11.076207   48804 addons.go:70] Setting storage-provisioner=true in profile "functional-428744"
	I1201 19:26:11.076225   48804 addons.go:239] Setting addon storage-provisioner=true in "functional-428744"
	I1201 19:26:11.076239   48804 addons.go:70] Setting default-storageclass=true in profile "functional-428744"
	I1201 19:26:11.076254   48804 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-428744"
	I1201 19:26:11.076255   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.076600   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.076785   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.081245   48804 out.go:179] * Verifying Kubernetes components...
	I1201 19:26:11.087150   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:11.117851   48804 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:26:11.119516   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.119671   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.120991   48804 addons.go:239] Setting addon default-storageclass=true in "functional-428744"
	I1201 19:26:11.121044   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.121546   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.121741   48804 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.121759   48804 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1201 19:26:11.121797   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.157953   48804 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:11.157978   48804 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1201 19:26:11.158049   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.182138   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.197665   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.313464   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:11.333888   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.351804   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.088419   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088456   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088499   48804 retry.go:31] will retry after 370.622111ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088535   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088549   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088556   48804 retry.go:31] will retry after 214.864091ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088649   48804 node_ready.go:35] waiting up to 6m0s for node "functional-428744" to be "Ready" ...
	I1201 19:26:12.088787   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.088873   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.089197   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.304654   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.362814   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.366340   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.366413   48804 retry.go:31] will retry after 398.503688ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.459632   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:12.519830   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.523259   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.523294   48804 retry.go:31] will retry after 535.054731ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.589478   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.589570   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.589862   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.765159   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.827324   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.827370   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.827390   48804 retry.go:31] will retry after 739.755241ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.058728   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.089511   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.089585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.089856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.118077   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.118134   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.118154   48804 retry.go:31] will retry after 391.789828ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.510836   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.567332   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:13.570397   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.574026   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.574060   48804 retry.go:31] will retry after 1.18201014s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.589346   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.589417   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.589845   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.644640   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.644678   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.644695   48804 retry.go:31] will retry after 732.335964ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.089422   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.089515   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:14.089961   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:14.377221   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:14.438375   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.438421   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.438440   48804 retry.go:31] will retry after 1.236140087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.589732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.589826   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.590183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:14.756655   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:14.814049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.817149   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.817181   48804 retry.go:31] will retry after 1.12716485s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.089765   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.089856   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.090157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.588981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.675732   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:15.741410   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:15.741450   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.741469   48804 retry.go:31] will retry after 1.409201229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.944883   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:16.007405   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:16.007500   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.007543   48804 retry.go:31] will retry after 1.898784229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.089691   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.089768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.090129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:16.090198   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:16.589482   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.589810   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.089728   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.151412   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:17.212400   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.212446   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.212468   48804 retry.go:31] will retry after 4.05952317s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.588902   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.589279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.906643   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:17.968049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.968156   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.968182   48804 retry.go:31] will retry after 2.840296794s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:18.089284   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.089352   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.089631   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:18.588972   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.589046   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:18.589394   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:19.089061   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.089132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.089421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:19.588859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.588929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.589194   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.088895   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.089306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.808702   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:20.866089   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:20.869253   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:20.869291   48804 retry.go:31] will retry after 4.860979312s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.089785   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.089854   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.090172   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:21.090222   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:21.272551   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:21.327980   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:21.331648   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.331684   48804 retry.go:31] will retry after 4.891109087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.089331   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.089409   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.089753   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.589555   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.589684   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.089701   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.089772   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.090125   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.589808   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.590266   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:23.590323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:24.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.089005   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.089273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:24.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.589377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.089325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.730733   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:25.787142   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:25.790610   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:25.790640   48804 retry.go:31] will retry after 7.92097549s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:26.089409   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:26.223678   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:26.278607   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:26.281989   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.282022   48804 retry.go:31] will retry after 7.531816175s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.589432   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.589521   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.589840   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.089669   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.089751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.090069   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.589693   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.589764   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.590089   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:28.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.089997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.090335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:28.090387   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:28.589510   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.589583   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.589844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.089683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.090056   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.589880   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.589968   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.590369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:30.109683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.109762   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.110054   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:30.110098   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:30.589806   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.590200   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.089177   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.089252   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.089645   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.589198   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.589085   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:32.589565   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:33.089830   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.089902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.090208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.712788   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:33.771136   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.774250   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.774284   48804 retry.go:31] will retry after 5.105632097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.814618   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:33.891338   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.891375   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.891394   48804 retry.go:31] will retry after 5.576720242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:34.089900   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.089994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.090334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:34.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.589260   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:35.088913   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.088982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:35.089359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:35.589057   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.589129   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.589530   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.089310   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:37.089182   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.089255   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:37.089610   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:37.589091   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.589170   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.589433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.089395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.880960   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:38.943302   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:38.943343   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:38.943363   48804 retry.go:31] will retry after 13.228566353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.089598   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.089672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.089960   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:39.090011   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:39.469200   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:39.525826   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:39.528963   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.528998   48804 retry.go:31] will retry after 17.183760318s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.589169   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.589241   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.589577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.089008   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.089433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.089139   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.089214   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.089595   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.589301   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.589750   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:41.589806   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:42.089592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.089667   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.089940   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:42.589720   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.589791   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.590109   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.089757   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.089835   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.090111   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.589514   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.589585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.589848   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:43.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:44.089653   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:44.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.589895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:45.090381   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.090466   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.092630   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1201 19:26:45.589592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.589673   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.590001   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:45.590051   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:46.089834   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.089916   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.090265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:46.588963   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.089324   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.089402   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.089734   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.589563   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.589642   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.590061   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:47.590178   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:48.089732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.089808   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.090071   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:48.589851   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.589928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.590267   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.088857   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.088930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.089271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.590253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:49.590304   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:50.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:50.589030   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.589106   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.089232   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.089302   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.089614   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.589210   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.589283   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.589653   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:52.089565   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.089648   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.089984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:52.090044   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:52.172403   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:52.228163   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:52.231129   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.231163   48804 retry.go:31] will retry after 19.315790709s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.589726   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.589977   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.089744   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.089824   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.589934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.590235   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.589137   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.589243   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.589618   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:54.589675   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:55.089338   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.089423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.089771   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:55.589523   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.589594   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.589856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.089664   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.588804   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.588881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.713576   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:56.772710   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:56.775873   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:56.775910   48804 retry.go:31] will retry after 15.04087383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:57.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.089334   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.089591   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:57.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:57.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.589000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.588867   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.589237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.088980   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.089051   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.089363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.589124   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.589220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.589536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:59.589590   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:00.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.089350   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.089679   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:00.589522   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.589597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.589979   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.088921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.089174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.588991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.589359   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:02.089003   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.089441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:02.089520   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:02.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.588931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:04.089192   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.089265   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:04.089578   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:04.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.589355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.088902   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.088977   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.589296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.088932   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:06.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:07.088819   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.089191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:07.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.589307   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.589807   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.589880   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.590129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:08.590170   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:09.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.089269   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:09.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.589272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.589096   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.589428   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.089194   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.089643   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:11.089702   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:11.547197   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:11.589587   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.589653   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.589873   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.606598   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.609801   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.609839   48804 retry.go:31] will retry after 19.642669348s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.817534   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:11.881682   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.881743   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.881763   48804 retry.go:31] will retry after 44.665994167s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:12.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:12.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.088988   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:13.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:14.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:14.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.088989   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.089075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.089465   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.589182   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.589270   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:15.589609   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:16.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:16.588919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.589317   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.089377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.588846   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.589232   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:18.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:18.089371   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:18.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.589350   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.088809   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.088891   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.089153   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.588895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:20.089911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.089989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.090331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:20.090392   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:20.589054   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.589132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.589374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.089343   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.089681   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.589436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.589535   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.088935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.089210   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.588895   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.588975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:22.589363   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:23.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.089301   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:23.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.589747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.589992   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.089847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.089932   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.090273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.588986   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:24.589445   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:25.089736   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.089809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.090059   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:25.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.588915   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.089346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.588967   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:27.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.089316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:27.089370   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:27.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.089038   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.089114   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:29.089044   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.089124   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.089459   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:29.089532   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:29.589183   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.589521   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.089020   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.089103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.089462   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.088828   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.088907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.089239   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.252679   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:31.310178   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:31.313107   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.313144   48804 retry.go:31] will retry after 31.234541362s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.589652   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.589739   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.590099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:31.590157   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:32.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:32.589064   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.589140   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.089586   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.589302   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.589377   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:34.089480   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.089566   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.089825   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:34.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:34.589708   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.589788   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.088875   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.088959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.089209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.588958   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.589291   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:36.589344   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:37.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.089244   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:37.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.589284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.589536   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.589614   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.589859   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:38.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:39.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.089743   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.090090   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:39.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.590181   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.088979   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.089261   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.589033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.589335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:41.089238   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.089312   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.089670   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:41.089726   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:41.589477   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.589816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.089787   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.089858   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.090183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.088926   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.589305   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:43.589360   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:44.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:44.589583   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.589664   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.589930   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.089936   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.090240   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:45.589423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:46.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:46.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.088993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.089287   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.588825   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.588900   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.589160   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:48.088919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.089001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:48.089402   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:48.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.589148   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.089140   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.089204   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.089439   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.588992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:50.088977   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:50.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:50.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.089222   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.089296   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.089666   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.589233   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.589315   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.589663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:52.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.089519   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.089816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:52.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:52.589625   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.589697   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.089857   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.089935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.090294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.089419   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.588992   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.589064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.589387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:54.589442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:55.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.088988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:55.589056   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.589135   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.589478   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.089109   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.548010   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:56.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.589293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.618422   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621596   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621692   48804 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:27:57.089694   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.089774   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.090105   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:57.090156   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:57.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.089844   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.089911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.090167   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.588968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.089080   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.089152   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.089448   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.589149   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.589228   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.589504   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:59.589556   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:00.089006   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:00.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.089210   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.089282   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.088960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:02.089423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:02.547921   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:28:02.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.589300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.609226   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612351   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612446   48804 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:28:02.615606   48804 out.go:179] * Enabled addons: 
	I1201 19:28:02.619164   48804 addons.go:530] duration metric: took 1m51.54309696s for enable addons: enabled=[]
	I1201 19:28:03.089670   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.090185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:03.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.089110   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.588949   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.589049   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.589402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:04.589461   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:05.089449   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.089546   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.089857   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:05.589588   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.589671   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.589935   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.089746   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.089819   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.090155   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.588853   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.588925   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.589422   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:07.089306   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.089384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.089671   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:07.089725   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:07.589476   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.089738   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.090110   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.589762   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.589829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.590138   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:09.589404   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:10.089052   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.089126   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:10.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.089232   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.089589   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.589170   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.589715   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:11.589763   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:12.089752   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.089829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:12.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.588998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.089832   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.089899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.090285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.588827   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.588899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.589250   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:14.088849   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.089292   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:14.089362   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:14.589658   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.589982   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.089907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.089992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.090441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.589364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:16.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.089055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:16.089598   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:16.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.589342   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.088984   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.589065   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:18.589385   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:19.089029   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:19.588898   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.089123   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.089516   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.588858   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.588926   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:21.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.089230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:21.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:21.588946   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.589356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.089252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.588847   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.588920   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.589241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.588809   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:23.589269   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:24.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:24.588960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.589427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.089763   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.089831   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.090097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.589881   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.589959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.590297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:25.590357   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:26.089013   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.089528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:26.589214   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.589286   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.589603   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.089467   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.089559   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.089881   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.589752   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.590104   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:28.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.089776   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.090051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:28.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:28.589863   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.589941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.590271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.089376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.589270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.088976   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.089446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.589171   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.589249   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.589613   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:30.589671   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:31.089382   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.089449   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.089763   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:31.589556   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.589638   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.589939   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.088836   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:33.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.089356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:33.089416   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:33.588938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.088874   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.089304   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.089364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:35.589306   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:36.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.089000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.089328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:36.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.589327   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.089234   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:37.589407   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:38.089085   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.089167   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.089517   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:38.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.588949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.589220   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.089344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:40.096455   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.096551   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.096874   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:40.097064   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:40.589786   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.589855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.590188   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.089116   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.089196   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.089535   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.589129   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.589203   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.589458   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.089448   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.089553   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.589577   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.589661   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.590007   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:42.590065   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:43.089576   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.089651   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.089904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:43.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.589746   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.590046   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.089837   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.089907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.090256   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.588933   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:45.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.089331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:45.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:45.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.589171   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.089921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.090252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:47.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.089037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.089393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:47.089451   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:47.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.588928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.589192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.088891   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.089303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.589063   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:49.089120   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.089200   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.089463   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:49.089529   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:49.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.588993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.088816   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.088895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.089241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.589245   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:51.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.089220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:51.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:51.589292   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.589374   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.589732   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.089536   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.089603   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.089870   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.589721   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.589798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.590135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.088861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.089284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.588978   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:53.589377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:54.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.089061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.089555   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:54.589299   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.589391   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.589805   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.089595   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.089665   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.089924   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.589751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:55.590097   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:56.089729   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.089807   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:56.589823   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.589890   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.589032   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.589112   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:58.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.089269   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.089543   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:58.089583   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:58.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.589585   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.589652   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.589904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:00.090091   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.090176   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.090503   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:00.090549   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:00.589349   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.589423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.589759   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.089644   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.089715   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.089978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.589828   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.589917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.590306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.588896   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.589271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:02.589323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:03.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:03.589098   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.589185   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.589576   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.089251   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.089329   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.089606   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.589278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:05.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:05.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:05.588996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.589369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.589275   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.088820   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.088892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.089135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.589860   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.589935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.590230   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:07.590276   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:08.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.089375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:08.588887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.588960   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.589213   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.088905   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.088991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.089309   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.589025   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.589102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.589421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:10.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.089125   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:10.089434   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:10.589102   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.589179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.589460   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.089329   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.089406   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.089844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.589591   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.589659   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.088917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:12.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:13.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:13.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.589047   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.089033   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.089449   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.588871   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.588938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:15.089001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:15.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:15.589096   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.589199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.589514   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.089742   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.090072   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.589844   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.589924   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.590265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:17.088934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:17.089471   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:17.589173   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.589246   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.589526   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.088963   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.089323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.589022   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.088922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.089208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.588956   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:19.589431   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:20.089101   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.089182   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.089476   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:20.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.589182   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.089165   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.089546   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.589316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:22.089229   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.089646   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:22.089715   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:22.589537   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.589607   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.589906   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.089700   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.089798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.090113   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.590144   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.088883   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.089296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.589001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.589080   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:24.589410   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:25.088898   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:25.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.089398   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.588893   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.589273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:27.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:27.089379   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:27.589046   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.589122   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.589420   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.089204   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.589028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:29.089056   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.089134   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.089452   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:29.089528   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:29.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.589233   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.589511   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.088929   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.089391   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.589289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:31.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.089217   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.089510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:31.089554   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:31.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.089002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.589582   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.589657   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:33.089719   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.089796   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:33.090228   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:33.588933   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.089059   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.089141   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.089472   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.089077   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.089208   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.089761   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.589549   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.589624   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:35.589927   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:36.089659   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.089734   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:36.588832   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.589251   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.088965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.089289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:38.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.089408   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:38.089459   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:38.588843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.589178   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.088880   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.088961   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.089264   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.588969   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.589385   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.088901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.588965   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.589041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:40.589403   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:41.089288   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.089366   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.089704   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:41.589423   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.589506   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.589815   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.089782   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.089864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.090168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.589534   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:42.589596   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:43.089242   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.089663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:43.589454   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.589549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.589901   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.089759   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.089838   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.090150   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.589175   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:45.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.089091   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:45.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:45.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.088887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.089311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.589071   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.589393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:47.089406   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.089500   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.089826   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:47.089884   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:47.589600   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.589672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.589966   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.089769   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.090162   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.588884   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.589326   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.588995   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.589073   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.589417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:49.589467   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:50.089159   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.089254   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.089647   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:50.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.589215   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.089071   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.089145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.089475   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:52.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.089238   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:52.089288   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:52.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.589750   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.589814   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.590123   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:54.089823   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.089898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.090247   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:54.090303   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:54.588852   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.588930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.089270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.588966   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.089360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.589038   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.589104   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.589401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:56.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:57.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.089090   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:57.588994   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.589111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.089014   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.089394   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.588939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.589358   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:59.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.089299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:59.089369   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:59.589697   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.589768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.089253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:01.089610   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.089745   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:01.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:01.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.589966   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.590319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.089172   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.089260   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.089600   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.088937   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.588977   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.589052   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:03.589478   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:04.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:04.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.589299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.088972   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.589752   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.589827   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:05.590195   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:06.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:06.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.088896   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.089282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:08.089094   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.089179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.089559   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:08.089615   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:08.589268   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.589341   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.589676   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.089519   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.089597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.089926   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.589719   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.589797   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.590134   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.088842   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.088923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:10.589466   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:11.089455   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.089549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.089928   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:11.589660   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.589731   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.589984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.089097   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.089199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.589383   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.589475   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.589880   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:12.589952   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:13.089681   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.089750   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:13.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.590299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.589280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:15.088982   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:15.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:15.589628   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.589698   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.590008   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.089799   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.090158   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.588891   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.588824   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:17.589312   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:18.088965   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.089050   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:18.589104   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.589181   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.589539   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.089070   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.089333   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.589020   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:19.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:20.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.089400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:20.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.589230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.589528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.089246   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.089319   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.089743   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.589336   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.589427   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.589837   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:21.589900   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:22.089716   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.089803   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.090099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:22.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.589969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.590315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.589157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:24.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.089010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.089362   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:24.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:24.589099   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.589172   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.589544   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.089055   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.089127   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.089434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.588952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:26.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.089020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:26.089524   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:26.588879   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.088972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.089314   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.589053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.589130   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.589456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.088844   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.089168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:28.589390   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:29.088940   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.089009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:29.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.589072   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.589384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.089095   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.089945   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.589738   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.590195   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:30.590251   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:31.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.089111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.089438   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:31.588985   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.589056   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.088915   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.588989   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.589324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:33.088946   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.089384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:33.089440   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:33.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.589343   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.089030   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.589373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:35.089090   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.089168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:35.089625   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:35.588821   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.589161   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.088971   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.089321   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.589745   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.589817   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.590097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:37.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.089699   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.089969   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:37.090012   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:37.589546   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.589637   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.589963   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.089733   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.089804   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.090142   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.589803   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.589876   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.088981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.089329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.589036   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.589107   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:39.589515   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:40.088909   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.089345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:40.589047   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.589120   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.089564   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.089897   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.589633   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.589911   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:41.589956   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:42.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.089280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.588990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.588921   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:44.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.089337   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:44.089388   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:44.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.589923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.590187   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.089403   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.589135   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.589226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.589637   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.088870   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.089279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.589345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:46.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:47.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:47.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.589110   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.589189   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.589550   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:48.589608   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:49.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:49.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.588965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.589274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.588817   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.588886   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.589146   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:51.089119   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.089223   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.089571   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:51.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:51.589291   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.089674   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.090013   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.589771   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.589847   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:53.089897   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.089975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.090297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:53.090359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:53.589788   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.589869   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.590118   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.088938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.089272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.588948   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.588997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:55.589383   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:56.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.089147   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.089578   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:56.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.589253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.088920   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:58.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.089773   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.090032   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:58.090073   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:58.589805   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.589877   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.088885   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.088954   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.089285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.589709   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.589783   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.590045   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:00.089976   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.090062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.090455   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:00.090523   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:00.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.589022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.089193   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.089258   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.089567   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.589248   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.589320   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.589696   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.089617   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.089689   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.090033   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.589742   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.589809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.590065   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:02.590107   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:03.089840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.089919   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.090274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:03.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.588964   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.088940   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.089202   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:05.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:05.089397   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:05.589817   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.589881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.590139   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.088913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.089226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.089268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.589804   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:07.590283   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:08.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.089427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:08.588851   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.088893   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.588982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:10.088978   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.089059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.089383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:10.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.589086   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.589443   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.089293   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.089375   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.089754   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.588862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:12.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:12.089477   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:12.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.088863   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.089236   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.589400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.089062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.589792   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.589864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.590157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:14.590205   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:15.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.089998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.090393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:15.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.089755   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.089823   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.090149   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:17.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:17.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:17.589041   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.589117   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.089353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.589024   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.589103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.089036   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.589006   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:19.589433   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:20.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.089045   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:20.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.589892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.590189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.089147   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.089226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:22.089347   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.089422   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.089710   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:22.089757   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:22.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.589640   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.589978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.089774   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.090209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.589840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.589913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.590166   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.089300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.589281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:24.589334   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:25.089831   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.089896   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:25.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.589601   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.589668   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.589943   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:26.589982   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:27.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.088939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:27.588870   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.588951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.089205   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.589061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.589381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:29.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:29.089377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:29.588973   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.088974   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.089053   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.089429   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.589066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:31.089200   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.089577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:31.089637   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:31.588904   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.089340   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.089680   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.589442   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.589524   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.589781   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:33.089603   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.089675   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.089988   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:33.090052   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:33.589773   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.590174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.089801   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.090171   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.589294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.089011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.589740   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.589810   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.590064   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:35.590105   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:36.089859   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.089929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.090255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:36.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.589001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.089192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:38.088964   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:38.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:38.588876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.588953   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.589211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.089039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.089017   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.089088   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.589633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.589707   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.590026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:40.590081   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:41.089028   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:41.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.589178   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.589434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.089390   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.089474   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.089854   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:42.590148   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:43.089748   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.089825   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.090133   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:43.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.589249   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.588882   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.589201   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:45.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.089330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:45.089382   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:45.589179   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.589564   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.089258   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.089345   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.089648   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.589361   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.589436   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.589775   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:47.089602   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.089682   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.090003   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:47.090068   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:47.589765   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.088918   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.089233   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.589032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.089114   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.089186   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.089669   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.589463   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.589550   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.589841   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:49.589889   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:50.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.089706   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.090067   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:50.589698   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.589782   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.590096   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.089244   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:52.088879   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:52.089255   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:52.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.589094   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.589168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.589423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:54.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:54.089441   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:54.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.088999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.089276   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.589378   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:56.088961   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.089405   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:56.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:56.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.588950   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.089074   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.589339   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.089278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.589453   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:58.589546   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:59.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:59.588990   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.589367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.089002   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.089412   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.589613   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.589705   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:00.590166   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:01.088830   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.089237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:01.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.089351   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.089432   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.089784   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.589531   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.589609   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.589892   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:03.089722   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:03.090212   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:03.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.589338   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.089007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.588999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.089026   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.089164   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.089649   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.589145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.589411   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:05.589452   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:06.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.089008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:06.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.089795   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.089860   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.090124   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.588839   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.589229   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:08.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:08.089432   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:08.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.088948   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.589158   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.589644   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:10.089588   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.089666   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.090026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:10.090112   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:10.588810   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.589228   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.089103   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.089180   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.089540   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:12.088890   48804 type.go:168] "Request Body" body=""
	I1201 19:32:12.089251   48804 node_ready.go:38] duration metric: took 6m0.000540563s for node "functional-428744" to be "Ready" ...
	I1201 19:32:12.092425   48804 out.go:203] 
	W1201 19:32:12.095253   48804 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1201 19:32:12.095277   48804 out.go:285] * 
	* 
	W1201 19:32:12.097463   48804 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:32:12.100606   48804 out.go:203] 

                                                
                                                
** /stderr **
functional_test.go:676: failed to soft start minikube. args "out/minikube-linux-arm64 start -p functional-428744 --alsologtostderr -v=8": exit status 80
functional_test.go:678: soft start took 6m5.859121255s for "functional-428744" cluster.
I1201 19:32:12.568910    4305 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (403.848098ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 logs -n 25: (1.010885504s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /usr/share/ca-certificates/43052.pem                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /etc/test/nested/copy/4305/hosts                                                                                                 │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image save kicbase/echo-server:functional-019259 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ update-context │ functional-019259 update-context --alsologtostderr -v=2                                                                                                         │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ update-context │ functional-019259 update-context --alsologtostderr -v=2                                                                                                         │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image rm kicbase/echo-server:functional-019259 --alsologtostderr                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image save --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format yaml --alsologtostderr                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format short --alsologtostderr                                                                                                     │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format json --alsologtostderr                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format table --alsologtostderr                                                                                                     │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh pgrep buildkitd                                                                                                                           │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ image          │ functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr                                                          │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ delete         │ -p functional-019259                                                                                                                                            │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ start          │ -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ start          │ -p functional-428744 --alsologtostderr -v=8                                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:26 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:26:06
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:26:06.760311   48804 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:26:06.760471   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760480   48804 out.go:374] Setting ErrFile to fd 2...
	I1201 19:26:06.760485   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760749   48804 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:26:06.761114   48804 out.go:368] Setting JSON to false
	I1201 19:26:06.761974   48804 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4118,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:26:06.762048   48804 start.go:143] virtualization:  
	I1201 19:26:06.765446   48804 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:26:06.769259   48804 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:26:06.769379   48804 notify.go:221] Checking for updates...
	I1201 19:26:06.775400   48804 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:26:06.778339   48804 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:06.781100   48804 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:26:06.784047   48804 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:26:06.786945   48804 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:26:06.790355   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:06.790504   48804 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:26:06.817889   48804 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:26:06.818002   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.874928   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.865437959 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.875040   48804 docker.go:319] overlay module found
	I1201 19:26:06.878298   48804 out.go:179] * Using the docker driver based on existing profile
	I1201 19:26:06.881322   48804 start.go:309] selected driver: docker
	I1201 19:26:06.881345   48804 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.881455   48804 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:26:06.881703   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.946129   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.93658681 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.946541   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:06.946612   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:06.946692   48804 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.949952   48804 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:26:06.952666   48804 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:26:06.955511   48804 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:26:06.958482   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:06.958560   48804 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:26:06.978189   48804 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:26:06.978215   48804 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:26:07.013576   48804 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:26:07.245550   48804 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:26:07.245729   48804 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:26:07.245814   48804 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245902   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:26:07.245911   48804 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 111.155µs
	I1201 19:26:07.245925   48804 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:26:07.245935   48804 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245965   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:26:07.245971   48804 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 37.068µs
	I1201 19:26:07.245977   48804 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:26:07.245979   48804 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:26:07.245986   48804 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246018   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:26:07.246022   48804 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 37.371µs
	I1201 19:26:07.246020   48804 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246029   48804 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246041   48804 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246063   48804 start.go:364] duration metric: took 29.397µs to acquireMachinesLock for "functional-428744"
	I1201 19:26:07.246076   48804 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:26:07.246081   48804 fix.go:54] fixHost starting: 
	I1201 19:26:07.246083   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:26:07.246089   48804 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 51.212µs
	I1201 19:26:07.246094   48804 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246103   48804 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246129   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:26:07.246135   48804 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 32.744µs
	I1201 19:26:07.246145   48804 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246154   48804 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246179   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:26:07.246184   48804 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.013µs
	I1201 19:26:07.246189   48804 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:26:07.246197   48804 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246221   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:26:07.246225   48804 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 29.356µs
	I1201 19:26:07.246230   48804 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:26:07.246238   48804 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246268   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:26:07.246273   48804 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 35.526µs
	I1201 19:26:07.246278   48804 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:26:07.246288   48804 cache.go:87] Successfully saved all images to host disk.
	I1201 19:26:07.246352   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:07.263626   48804 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:26:07.263658   48804 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:26:07.267042   48804 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:26:07.267094   48804 machine.go:94] provisionDockerMachine start ...
	I1201 19:26:07.267191   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.284298   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.284633   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.284647   48804 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:26:07.445599   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.445668   48804 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:26:07.445742   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.466448   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.466762   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.466780   48804 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:26:07.626795   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.626872   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.646204   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.646540   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.646566   48804 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:26:07.797736   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:26:07.797765   48804 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:26:07.797791   48804 ubuntu.go:190] setting up certificates
	I1201 19:26:07.797801   48804 provision.go:84] configureAuth start
	I1201 19:26:07.797871   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:07.815670   48804 provision.go:143] copyHostCerts
	I1201 19:26:07.815726   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815768   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:26:07.815790   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815876   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:26:07.815970   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.815990   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:26:07.815998   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.816026   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:26:07.816080   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816100   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:26:07.816107   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816131   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:26:07.816190   48804 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:26:07.904001   48804 provision.go:177] copyRemoteCerts
	I1201 19:26:07.904069   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:26:07.904109   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.922469   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.029518   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1201 19:26:08.029579   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:26:08.047419   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1201 19:26:08.047495   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:26:08.069296   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1201 19:26:08.069377   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:26:08.088982   48804 provision.go:87] duration metric: took 291.155414ms to configureAuth
	I1201 19:26:08.089064   48804 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:26:08.089321   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:08.089350   48804 machine.go:97] duration metric: took 822.24428ms to provisionDockerMachine
	I1201 19:26:08.089385   48804 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:26:08.089416   48804 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:26:08.089542   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:26:08.089633   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.112132   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.217325   48804 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:26:08.220778   48804 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1201 19:26:08.220802   48804 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1201 19:26:08.220808   48804 command_runner.go:130] > VERSION_ID="12"
	I1201 19:26:08.220813   48804 command_runner.go:130] > VERSION="12 (bookworm)"
	I1201 19:26:08.220817   48804 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1201 19:26:08.220820   48804 command_runner.go:130] > ID=debian
	I1201 19:26:08.220825   48804 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1201 19:26:08.220831   48804 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1201 19:26:08.220837   48804 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1201 19:26:08.220885   48804 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:26:08.220907   48804 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:26:08.220919   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:26:08.220978   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:26:08.221055   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:26:08.221066   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /etc/ssl/certs/43052.pem
	I1201 19:26:08.221140   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:26:08.221148   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> /etc/test/nested/copy/4305/hosts
	I1201 19:26:08.221198   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:26:08.229002   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:08.246695   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:26:08.263789   48804 start.go:296] duration metric: took 174.371826ms for postStartSetup
	I1201 19:26:08.263869   48804 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:26:08.263931   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.281235   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.382466   48804 command_runner.go:130] > 12%
	I1201 19:26:08.382557   48804 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:26:08.386763   48804 command_runner.go:130] > 172G
	I1201 19:26:08.387182   48804 fix.go:56] duration metric: took 1.141096136s for fixHost
	I1201 19:26:08.387210   48804 start.go:83] releasing machines lock for "functional-428744", held for 1.141138241s
	I1201 19:26:08.387280   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:08.405649   48804 ssh_runner.go:195] Run: cat /version.json
	I1201 19:26:08.405673   48804 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:26:08.405720   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.405736   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.424898   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.435929   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.615638   48804 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1201 19:26:08.615700   48804 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1201 19:26:08.615817   48804 ssh_runner.go:195] Run: systemctl --version
	I1201 19:26:08.621830   48804 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1201 19:26:08.621881   48804 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1201 19:26:08.622279   48804 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1201 19:26:08.626405   48804 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1201 19:26:08.626689   48804 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:26:08.626779   48804 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:26:08.634801   48804 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:26:08.634864   48804 start.go:496] detecting cgroup driver to use...
	I1201 19:26:08.634909   48804 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:26:08.634995   48804 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:26:08.650643   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:26:08.663900   48804 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:26:08.663962   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:26:08.680016   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:26:08.693295   48804 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:26:08.807192   48804 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:26:08.949829   48804 docker.go:234] disabling docker service ...
	I1201 19:26:08.949910   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:26:08.965005   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:26:08.978389   48804 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:26:09.113220   48804 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:26:09.265765   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:26:09.280775   48804 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:26:09.295503   48804 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1201 19:26:09.296833   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:26:09.307263   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:26:09.316009   48804 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:26:09.316129   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:26:09.324849   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.333586   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:26:09.341989   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.350174   48804 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:26:09.358089   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:26:09.366694   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:26:09.375459   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:26:09.384162   48804 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:26:09.390646   48804 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1201 19:26:09.391441   48804 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:26:09.398673   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:09.519779   48804 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:26:09.650665   48804 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:26:09.650790   48804 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:26:09.655039   48804 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1201 19:26:09.655139   48804 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1201 19:26:09.655166   48804 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1201 19:26:09.655199   48804 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:09.655222   48804 command_runner.go:130] > Access: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655243   48804 command_runner.go:130] > Modify: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655266   48804 command_runner.go:130] > Change: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655294   48804 command_runner.go:130] >  Birth: -
	I1201 19:26:09.655330   48804 start.go:564] Will wait 60s for crictl version
	I1201 19:26:09.655409   48804 ssh_runner.go:195] Run: which crictl
	I1201 19:26:09.659043   48804 command_runner.go:130] > /usr/local/bin/crictl
	I1201 19:26:09.659221   48804 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:26:09.684907   48804 command_runner.go:130] > Version:  0.1.0
	I1201 19:26:09.684979   48804 command_runner.go:130] > RuntimeName:  containerd
	I1201 19:26:09.684999   48804 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1201 19:26:09.685021   48804 command_runner.go:130] > RuntimeApiVersion:  v1
	I1201 19:26:09.687516   48804 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:26:09.687623   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.708580   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.710309   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.728879   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.737012   48804 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:26:09.739912   48804 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:26:09.756533   48804 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:26:09.760816   48804 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1201 19:26:09.760978   48804 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:26:09.761088   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:09.761147   48804 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:26:09.788491   48804 command_runner.go:130] > {
	I1201 19:26:09.788509   48804 command_runner.go:130] >   "images":  [
	I1201 19:26:09.788514   48804 command_runner.go:130] >     {
	I1201 19:26:09.788524   48804 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1201 19:26:09.788529   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788534   48804 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1201 19:26:09.788538   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788542   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788546   48804 command_runner.go:130] >       "size":  "8032639",
	I1201 19:26:09.788552   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788556   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788560   48804 command_runner.go:130] >     },
	I1201 19:26:09.788563   48804 command_runner.go:130] >     {
	I1201 19:26:09.788570   48804 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1201 19:26:09.788573   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788578   48804 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1201 19:26:09.788582   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788586   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788598   48804 command_runner.go:130] >       "size":  "21166088",
	I1201 19:26:09.788603   48804 command_runner.go:130] >       "username":  "nonroot",
	I1201 19:26:09.788611   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788615   48804 command_runner.go:130] >     },
	I1201 19:26:09.788617   48804 command_runner.go:130] >     {
	I1201 19:26:09.788624   48804 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1201 19:26:09.788628   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788633   48804 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1201 19:26:09.788636   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788639   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788643   48804 command_runner.go:130] >       "size":  "21134420",
	I1201 19:26:09.788647   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788651   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788654   48804 command_runner.go:130] >       },
	I1201 19:26:09.788658   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788662   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788665   48804 command_runner.go:130] >     },
	I1201 19:26:09.788668   48804 command_runner.go:130] >     {
	I1201 19:26:09.788675   48804 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1201 19:26:09.788678   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788685   48804 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1201 19:26:09.788689   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788692   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788697   48804 command_runner.go:130] >       "size":  "24676285",
	I1201 19:26:09.788700   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788704   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788707   48804 command_runner.go:130] >       },
	I1201 19:26:09.788711   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788715   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788718   48804 command_runner.go:130] >     },
	I1201 19:26:09.788721   48804 command_runner.go:130] >     {
	I1201 19:26:09.788728   48804 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1201 19:26:09.788732   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788739   48804 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1201 19:26:09.788743   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788750   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788755   48804 command_runner.go:130] >       "size":  "20658969",
	I1201 19:26:09.788759   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788762   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788765   48804 command_runner.go:130] >       },
	I1201 19:26:09.788769   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788773   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788776   48804 command_runner.go:130] >     },
	I1201 19:26:09.788779   48804 command_runner.go:130] >     {
	I1201 19:26:09.788786   48804 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1201 19:26:09.788790   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788795   48804 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1201 19:26:09.788799   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788803   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788807   48804 command_runner.go:130] >       "size":  "22428165",
	I1201 19:26:09.788814   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788818   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788822   48804 command_runner.go:130] >     },
	I1201 19:26:09.788825   48804 command_runner.go:130] >     {
	I1201 19:26:09.788832   48804 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1201 19:26:09.788835   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788841   48804 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1201 19:26:09.788844   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788855   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788860   48804 command_runner.go:130] >       "size":  "15389290",
	I1201 19:26:09.788863   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788867   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788870   48804 command_runner.go:130] >       },
	I1201 19:26:09.788874   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788878   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788881   48804 command_runner.go:130] >     },
	I1201 19:26:09.788883   48804 command_runner.go:130] >     {
	I1201 19:26:09.788890   48804 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1201 19:26:09.788897   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788902   48804 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1201 19:26:09.788905   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788908   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788912   48804 command_runner.go:130] >       "size":  "265458",
	I1201 19:26:09.788920   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788924   48804 command_runner.go:130] >         "value":  "65535"
	I1201 19:26:09.788927   48804 command_runner.go:130] >       },
	I1201 19:26:09.788931   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788934   48804 command_runner.go:130] >       "pinned":  true
	I1201 19:26:09.788937   48804 command_runner.go:130] >     }
	I1201 19:26:09.788940   48804 command_runner.go:130] >   ]
	I1201 19:26:09.788943   48804 command_runner.go:130] > }
	I1201 19:26:09.791239   48804 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:26:09.791264   48804 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:26:09.791273   48804 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:26:09.791374   48804 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:26:09.791446   48804 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:26:09.822661   48804 command_runner.go:130] > {
	I1201 19:26:09.822679   48804 command_runner.go:130] >   "cniconfig": {
	I1201 19:26:09.822684   48804 command_runner.go:130] >     "Networks": [
	I1201 19:26:09.822688   48804 command_runner.go:130] >       {
	I1201 19:26:09.822694   48804 command_runner.go:130] >         "Config": {
	I1201 19:26:09.822699   48804 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1201 19:26:09.822704   48804 command_runner.go:130] >           "Name": "cni-loopback",
	I1201 19:26:09.822709   48804 command_runner.go:130] >           "Plugins": [
	I1201 19:26:09.822712   48804 command_runner.go:130] >             {
	I1201 19:26:09.822717   48804 command_runner.go:130] >               "Network": {
	I1201 19:26:09.822721   48804 command_runner.go:130] >                 "ipam": {},
	I1201 19:26:09.822726   48804 command_runner.go:130] >                 "type": "loopback"
	I1201 19:26:09.822730   48804 command_runner.go:130] >               },
	I1201 19:26:09.822735   48804 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1201 19:26:09.822738   48804 command_runner.go:130] >             }
	I1201 19:26:09.822741   48804 command_runner.go:130] >           ],
	I1201 19:26:09.822751   48804 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1201 19:26:09.822755   48804 command_runner.go:130] >         },
	I1201 19:26:09.822760   48804 command_runner.go:130] >         "IFName": "lo"
	I1201 19:26:09.822764   48804 command_runner.go:130] >       }
	I1201 19:26:09.822771   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822776   48804 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1201 19:26:09.822780   48804 command_runner.go:130] >     "PluginDirs": [
	I1201 19:26:09.822784   48804 command_runner.go:130] >       "/opt/cni/bin"
	I1201 19:26:09.822787   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822792   48804 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1201 19:26:09.822795   48804 command_runner.go:130] >     "Prefix": "eth"
	I1201 19:26:09.822798   48804 command_runner.go:130] >   },
	I1201 19:26:09.822801   48804 command_runner.go:130] >   "config": {
	I1201 19:26:09.822805   48804 command_runner.go:130] >     "cdiSpecDirs": [
	I1201 19:26:09.822809   48804 command_runner.go:130] >       "/etc/cdi",
	I1201 19:26:09.822813   48804 command_runner.go:130] >       "/var/run/cdi"
	I1201 19:26:09.822816   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822823   48804 command_runner.go:130] >     "cni": {
	I1201 19:26:09.822827   48804 command_runner.go:130] >       "binDir": "",
	I1201 19:26:09.822831   48804 command_runner.go:130] >       "binDirs": [
	I1201 19:26:09.822834   48804 command_runner.go:130] >         "/opt/cni/bin"
	I1201 19:26:09.822837   48804 command_runner.go:130] >       ],
	I1201 19:26:09.822842   48804 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1201 19:26:09.822846   48804 command_runner.go:130] >       "confTemplate": "",
	I1201 19:26:09.822849   48804 command_runner.go:130] >       "ipPref": "",
	I1201 19:26:09.822853   48804 command_runner.go:130] >       "maxConfNum": 1,
	I1201 19:26:09.822857   48804 command_runner.go:130] >       "setupSerially": false,
	I1201 19:26:09.822862   48804 command_runner.go:130] >       "useInternalLoopback": false
	I1201 19:26:09.822865   48804 command_runner.go:130] >     },
	I1201 19:26:09.822872   48804 command_runner.go:130] >     "containerd": {
	I1201 19:26:09.822876   48804 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1201 19:26:09.822881   48804 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1201 19:26:09.822886   48804 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1201 19:26:09.822892   48804 command_runner.go:130] >       "runtimes": {
	I1201 19:26:09.822896   48804 command_runner.go:130] >         "runc": {
	I1201 19:26:09.822901   48804 command_runner.go:130] >           "ContainerAnnotations": null,
	I1201 19:26:09.822905   48804 command_runner.go:130] >           "PodAnnotations": null,
	I1201 19:26:09.822914   48804 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1201 19:26:09.822919   48804 command_runner.go:130] >           "cgroupWritable": false,
	I1201 19:26:09.822923   48804 command_runner.go:130] >           "cniConfDir": "",
	I1201 19:26:09.822927   48804 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1201 19:26:09.822931   48804 command_runner.go:130] >           "io_type": "",
	I1201 19:26:09.822934   48804 command_runner.go:130] >           "options": {
	I1201 19:26:09.822939   48804 command_runner.go:130] >             "BinaryName": "",
	I1201 19:26:09.822943   48804 command_runner.go:130] >             "CriuImagePath": "",
	I1201 19:26:09.822947   48804 command_runner.go:130] >             "CriuWorkPath": "",
	I1201 19:26:09.822951   48804 command_runner.go:130] >             "IoGid": 0,
	I1201 19:26:09.822955   48804 command_runner.go:130] >             "IoUid": 0,
	I1201 19:26:09.822959   48804 command_runner.go:130] >             "NoNewKeyring": false,
	I1201 19:26:09.822963   48804 command_runner.go:130] >             "Root": "",
	I1201 19:26:09.822968   48804 command_runner.go:130] >             "ShimCgroup": "",
	I1201 19:26:09.822972   48804 command_runner.go:130] >             "SystemdCgroup": false
	I1201 19:26:09.822975   48804 command_runner.go:130] >           },
	I1201 19:26:09.822980   48804 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1201 19:26:09.822987   48804 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1201 19:26:09.822991   48804 command_runner.go:130] >           "runtimePath": "",
	I1201 19:26:09.822996   48804 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1201 19:26:09.823001   48804 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1201 19:26:09.823005   48804 command_runner.go:130] >           "snapshotter": ""
	I1201 19:26:09.823008   48804 command_runner.go:130] >         }
	I1201 19:26:09.823011   48804 command_runner.go:130] >       }
	I1201 19:26:09.823014   48804 command_runner.go:130] >     },
	I1201 19:26:09.823026   48804 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1201 19:26:09.823032   48804 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1201 19:26:09.823037   48804 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1201 19:26:09.823041   48804 command_runner.go:130] >     "disableApparmor": false,
	I1201 19:26:09.823045   48804 command_runner.go:130] >     "disableHugetlbController": true,
	I1201 19:26:09.823049   48804 command_runner.go:130] >     "disableProcMount": false,
	I1201 19:26:09.823054   48804 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1201 19:26:09.823058   48804 command_runner.go:130] >     "enableCDI": true,
	I1201 19:26:09.823068   48804 command_runner.go:130] >     "enableSelinux": false,
	I1201 19:26:09.823073   48804 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1201 19:26:09.823078   48804 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1201 19:26:09.823091   48804 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1201 19:26:09.823096   48804 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1201 19:26:09.823100   48804 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1201 19:26:09.823105   48804 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1201 19:26:09.823109   48804 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1201 19:26:09.823115   48804 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823119   48804 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1201 19:26:09.823125   48804 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823129   48804 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1201 19:26:09.823135   48804 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1201 19:26:09.823138   48804 command_runner.go:130] >   },
	I1201 19:26:09.823141   48804 command_runner.go:130] >   "features": {
	I1201 19:26:09.823145   48804 command_runner.go:130] >     "supplemental_groups_policy": true
	I1201 19:26:09.823148   48804 command_runner.go:130] >   },
	I1201 19:26:09.823152   48804 command_runner.go:130] >   "golang": "go1.24.9",
	I1201 19:26:09.823162   48804 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823173   48804 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823176   48804 command_runner.go:130] >   "runtimeHandlers": [
	I1201 19:26:09.823179   48804 command_runner.go:130] >     {
	I1201 19:26:09.823183   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823188   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823194   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823197   48804 command_runner.go:130] >       }
	I1201 19:26:09.823199   48804 command_runner.go:130] >     },
	I1201 19:26:09.823202   48804 command_runner.go:130] >     {
	I1201 19:26:09.823206   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823211   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823215   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823218   48804 command_runner.go:130] >       },
	I1201 19:26:09.823221   48804 command_runner.go:130] >       "name": "runc"
	I1201 19:26:09.823228   48804 command_runner.go:130] >     }
	I1201 19:26:09.823231   48804 command_runner.go:130] >   ],
	I1201 19:26:09.823235   48804 command_runner.go:130] >   "status": {
	I1201 19:26:09.823239   48804 command_runner.go:130] >     "conditions": [
	I1201 19:26:09.823242   48804 command_runner.go:130] >       {
	I1201 19:26:09.823245   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823249   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823252   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823257   48804 command_runner.go:130] >         "type": "RuntimeReady"
	I1201 19:26:09.823260   48804 command_runner.go:130] >       },
	I1201 19:26:09.823263   48804 command_runner.go:130] >       {
	I1201 19:26:09.823269   48804 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1201 19:26:09.823274   48804 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1201 19:26:09.823277   48804 command_runner.go:130] >         "status": false,
	I1201 19:26:09.823282   48804 command_runner.go:130] >         "type": "NetworkReady"
	I1201 19:26:09.823285   48804 command_runner.go:130] >       },
	I1201 19:26:09.823288   48804 command_runner.go:130] >       {
	I1201 19:26:09.823292   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823295   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823299   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823305   48804 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1201 19:26:09.823308   48804 command_runner.go:130] >       }
	I1201 19:26:09.823310   48804 command_runner.go:130] >     ]
	I1201 19:26:09.823313   48804 command_runner.go:130] >   }
	I1201 19:26:09.823316   48804 command_runner.go:130] > }
	I1201 19:26:09.824829   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:09.824854   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:09.824874   48804 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:26:09.824897   48804 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt Sta
ticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:26:09.825029   48804 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 19:26:09.825110   48804 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:26:09.833035   48804 command_runner.go:130] > kubeadm
	I1201 19:26:09.833056   48804 command_runner.go:130] > kubectl
	I1201 19:26:09.833061   48804 command_runner.go:130] > kubelet
	I1201 19:26:09.833076   48804 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:26:09.833134   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:26:09.840788   48804 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:26:09.853581   48804 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:26:09.866488   48804 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1201 19:26:09.879364   48804 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:26:09.883102   48804 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1201 19:26:09.883255   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:10.007542   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:10.337813   48804 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:26:10.337836   48804 certs.go:195] generating shared ca certs ...
	I1201 19:26:10.337853   48804 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:10.338014   48804 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:26:10.338073   48804 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:26:10.338085   48804 certs.go:257] generating profile certs ...
	I1201 19:26:10.338185   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:26:10.338247   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:26:10.338297   48804 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:26:10.338309   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1201 19:26:10.338322   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1201 19:26:10.338339   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1201 19:26:10.338351   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1201 19:26:10.338365   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1201 19:26:10.338377   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1201 19:26:10.338392   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1201 19:26:10.338406   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1201 19:26:10.338461   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:26:10.338495   48804 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:26:10.338507   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:26:10.338544   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:26:10.338574   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:26:10.338602   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:26:10.338653   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:10.338691   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.338709   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.338720   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem -> /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.339292   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:26:10.367504   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:26:10.391051   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:26:10.410924   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:26:10.429158   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:26:10.447137   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:26:10.464077   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:26:10.481473   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:26:10.498763   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:26:10.516542   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:26:10.534712   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:26:10.552802   48804 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:26:10.565633   48804 ssh_runner.go:195] Run: openssl version
	I1201 19:26:10.571657   48804 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1201 19:26:10.572092   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:26:10.580812   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584562   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584589   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584650   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.625269   48804 command_runner.go:130] > 3ec20f2e
	I1201 19:26:10.625746   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:26:10.633767   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:26:10.642160   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.645995   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646248   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646315   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.686937   48804 command_runner.go:130] > b5213941
	I1201 19:26:10.687439   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:26:10.695499   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:26:10.704517   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708133   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708431   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708519   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.749422   48804 command_runner.go:130] > 51391683
	I1201 19:26:10.749951   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:26:10.758524   48804 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762526   48804 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762565   48804 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1201 19:26:10.762572   48804 command_runner.go:130] > Device: 259,1	Inode: 1053621     Links: 1
	I1201 19:26:10.762579   48804 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:10.762585   48804 command_runner.go:130] > Access: 2025-12-01 19:22:03.818228473 +0000
	I1201 19:26:10.762590   48804 command_runner.go:130] > Modify: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762599   48804 command_runner.go:130] > Change: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762604   48804 command_runner.go:130] >  Birth: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762682   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:26:10.803623   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.804107   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:26:10.845983   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.846486   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:26:10.887221   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.887637   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:26:10.928253   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.928695   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:26:10.970677   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.971198   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:26:11.012420   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:11.012544   48804 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:11.012658   48804 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:26:11.012733   48804 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:26:11.044110   48804 cri.go:89] found id: ""
	I1201 19:26:11.044177   48804 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:26:11.054430   48804 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1201 19:26:11.054508   48804 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1201 19:26:11.054530   48804 command_runner.go:130] > /var/lib/minikube/etcd:
	I1201 19:26:11.054631   48804 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:26:11.054642   48804 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:26:11.054719   48804 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:26:11.063470   48804 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:26:11.063923   48804 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-428744" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.064051   48804 kubeconfig.go:62] /home/jenkins/minikube-integration/21997-2497/kubeconfig needs updating (will repair): [kubeconfig missing "functional-428744" cluster setting kubeconfig missing "functional-428744" context setting]
	I1201 19:26:11.064410   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.064918   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.065081   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.065855   48804 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1201 19:26:11.065877   48804 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1201 19:26:11.065883   48804 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1201 19:26:11.065889   48804 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1201 19:26:11.065893   48804 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1201 19:26:11.065945   48804 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1201 19:26:11.066161   48804 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:26:11.074525   48804 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1201 19:26:11.074603   48804 kubeadm.go:602] duration metric: took 19.955614ms to restartPrimaryControlPlane
	I1201 19:26:11.074623   48804 kubeadm.go:403] duration metric: took 62.08191ms to StartCluster
	I1201 19:26:11.074644   48804 settings.go:142] acquiring lock: {Name:mk0c68be267fd1e06eeb79721201896d000b433c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.074712   48804 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.075396   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.075623   48804 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 19:26:11.076036   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:11.076070   48804 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1201 19:26:11.076207   48804 addons.go:70] Setting storage-provisioner=true in profile "functional-428744"
	I1201 19:26:11.076225   48804 addons.go:239] Setting addon storage-provisioner=true in "functional-428744"
	I1201 19:26:11.076239   48804 addons.go:70] Setting default-storageclass=true in profile "functional-428744"
	I1201 19:26:11.076254   48804 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-428744"
	I1201 19:26:11.076255   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.076600   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.076785   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.081245   48804 out.go:179] * Verifying Kubernetes components...
	I1201 19:26:11.087150   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:11.117851   48804 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:26:11.119516   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.119671   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.120991   48804 addons.go:239] Setting addon default-storageclass=true in "functional-428744"
	I1201 19:26:11.121044   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.121546   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.121741   48804 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.121759   48804 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1201 19:26:11.121797   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.157953   48804 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:11.157978   48804 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1201 19:26:11.158049   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.182138   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.197665   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.313464   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:11.333888   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.351804   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.088419   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088456   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088499   48804 retry.go:31] will retry after 370.622111ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088535   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088549   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088556   48804 retry.go:31] will retry after 214.864091ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088649   48804 node_ready.go:35] waiting up to 6m0s for node "functional-428744" to be "Ready" ...
	I1201 19:26:12.088787   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.088873   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.089197   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.304654   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.362814   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.366340   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.366413   48804 retry.go:31] will retry after 398.503688ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.459632   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:12.519830   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.523259   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.523294   48804 retry.go:31] will retry after 535.054731ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.589478   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.589570   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.589862   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.765159   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.827324   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.827370   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.827390   48804 retry.go:31] will retry after 739.755241ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.058728   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.089511   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.089585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.089856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.118077   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.118134   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.118154   48804 retry.go:31] will retry after 391.789828ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.510836   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.567332   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:13.570397   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.574026   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.574060   48804 retry.go:31] will retry after 1.18201014s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.589346   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.589417   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.589845   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.644640   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.644678   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.644695   48804 retry.go:31] will retry after 732.335964ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.089422   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.089515   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:14.089961   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:14.377221   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:14.438375   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.438421   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.438440   48804 retry.go:31] will retry after 1.236140087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.589732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.589826   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.590183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:14.756655   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:14.814049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.817149   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.817181   48804 retry.go:31] will retry after 1.12716485s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.089765   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.089856   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.090157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.588981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.675732   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:15.741410   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:15.741450   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.741469   48804 retry.go:31] will retry after 1.409201229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.944883   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:16.007405   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:16.007500   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.007543   48804 retry.go:31] will retry after 1.898784229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.089691   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.089768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.090129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:16.090198   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:16.589482   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.589810   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.089728   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.151412   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:17.212400   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.212446   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.212468   48804 retry.go:31] will retry after 4.05952317s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.588902   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.589279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.906643   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:17.968049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.968156   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.968182   48804 retry.go:31] will retry after 2.840296794s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:18.089284   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.089352   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.089631   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:18.588972   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.589046   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:18.589394   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:19.089061   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.089132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.089421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:19.588859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.588929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.589194   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.088895   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.089306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.808702   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:20.866089   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:20.869253   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:20.869291   48804 retry.go:31] will retry after 4.860979312s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.089785   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.089854   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.090172   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:21.090222   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:21.272551   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:21.327980   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:21.331648   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.331684   48804 retry.go:31] will retry after 4.891109087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.089331   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.089409   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.089753   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.589555   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.589684   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.089701   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.089772   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.090125   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.589808   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.590266   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:23.590323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:24.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.089005   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.089273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:24.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.589377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.089325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.730733   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:25.787142   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:25.790610   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:25.790640   48804 retry.go:31] will retry after 7.92097549s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:26.089409   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:26.223678   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:26.278607   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:26.281989   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.282022   48804 retry.go:31] will retry after 7.531816175s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.589432   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.589521   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.589840   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.089669   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.089751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.090069   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.589693   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.589764   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.590089   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:28.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.089997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.090335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:28.090387   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:28.589510   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.589583   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.589844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.089683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.090056   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.589880   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.589968   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.590369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:30.109683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.109762   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.110054   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:30.110098   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:30.589806   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.590200   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.089177   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.089252   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.089645   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.589198   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.589085   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:32.589565   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:33.089830   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.089902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.090208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.712788   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:33.771136   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.774250   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.774284   48804 retry.go:31] will retry after 5.105632097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.814618   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:33.891338   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.891375   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.891394   48804 retry.go:31] will retry after 5.576720242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:34.089900   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.089994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.090334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:34.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.589260   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:35.088913   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.088982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:35.089359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:35.589057   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.589129   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.589530   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.089310   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:37.089182   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.089255   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:37.089610   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:37.589091   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.589170   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.589433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.089395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.880960   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:38.943302   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:38.943343   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:38.943363   48804 retry.go:31] will retry after 13.228566353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.089598   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.089672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.089960   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:39.090011   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:39.469200   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:39.525826   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:39.528963   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.528998   48804 retry.go:31] will retry after 17.183760318s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.589169   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.589241   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.589577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.089008   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.089433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.089139   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.089214   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.089595   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.589301   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.589750   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:41.589806   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:42.089592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.089667   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.089940   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:42.589720   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.589791   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.590109   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.089757   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.089835   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.090111   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.589514   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.589585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.589848   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:43.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:44.089653   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:44.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.589895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:45.090381   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.090466   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.092630   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1201 19:26:45.589592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.589673   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.590001   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:45.590051   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:46.089834   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.089916   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.090265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:46.588963   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.089324   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.089402   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.089734   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.589563   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.589642   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.590061   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:47.590178   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:48.089732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.089808   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.090071   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:48.589851   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.589928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.590267   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.088857   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.088930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.089271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.590253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:49.590304   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:50.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:50.589030   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.589106   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.089232   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.089302   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.089614   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.589210   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.589283   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.589653   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:52.089565   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.089648   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.089984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:52.090044   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:52.172403   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:52.228163   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:52.231129   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.231163   48804 retry.go:31] will retry after 19.315790709s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.589726   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.589977   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.089744   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.089824   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.589934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.590235   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.589137   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.589243   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.589618   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:54.589675   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:55.089338   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.089423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.089771   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:55.589523   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.589594   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.589856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.089664   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.588804   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.588881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.713576   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:56.772710   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:56.775873   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:56.775910   48804 retry.go:31] will retry after 15.04087383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:57.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.089334   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.089591   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:57.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:57.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.589000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.588867   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.589237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.088980   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.089051   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.089363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.589124   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.589220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.589536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:59.589590   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:00.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.089350   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.089679   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:00.589522   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.589597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.589979   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.088921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.089174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.588991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.589359   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:02.089003   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.089441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:02.089520   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:02.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.588931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:04.089192   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.089265   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:04.089578   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:04.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.589355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.088902   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.088977   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.589296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.088932   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:06.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:07.088819   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.089191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:07.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.589307   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.589807   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.589880   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.590129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:08.590170   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:09.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.089269   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:09.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.589272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.589096   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.589428   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.089194   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.089643   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:11.089702   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:11.547197   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:11.589587   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.589653   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.589873   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.606598   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.609801   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.609839   48804 retry.go:31] will retry after 19.642669348s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.817534   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:11.881682   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.881743   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.881763   48804 retry.go:31] will retry after 44.665994167s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:12.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:12.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.088988   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:13.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:14.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:14.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.088989   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.089075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.089465   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.589182   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.589270   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:15.589609   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:16.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:16.588919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.589317   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.089377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.588846   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.589232   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:18.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:18.089371   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:18.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.589350   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.088809   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.088891   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.089153   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.588895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:20.089911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.089989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.090331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:20.090392   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:20.589054   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.589132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.589374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.089343   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.089681   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.589436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.589535   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.088935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.089210   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.588895   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.588975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:22.589363   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:23.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.089301   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:23.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.589747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.589992   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.089847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.089932   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.090273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.588986   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:24.589445   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:25.089736   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.089809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.090059   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:25.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.588915   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.089346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.588967   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:27.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.089316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:27.089370   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:27.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.089038   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.089114   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:29.089044   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.089124   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.089459   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:29.089532   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:29.589183   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.589521   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.089020   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.089103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.089462   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.088828   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.088907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.089239   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.252679   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:31.310178   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:31.313107   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.313144   48804 retry.go:31] will retry after 31.234541362s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.589652   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.589739   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.590099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:31.590157   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:32.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:32.589064   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.589140   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.089586   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.589302   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.589377   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:34.089480   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.089566   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.089825   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:34.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:34.589708   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.589788   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.088875   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.088959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.089209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.588958   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.589291   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:36.589344   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:37.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.089244   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:37.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.589284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.589536   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.589614   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.589859   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:38.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:39.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.089743   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.090090   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:39.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.590181   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.088979   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.089261   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.589033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.589335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:41.089238   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.089312   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.089670   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:41.089726   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:41.589477   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.589816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.089787   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.089858   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.090183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.088926   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.589305   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:43.589360   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:44.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:44.589583   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.589664   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.589930   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.089936   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.090240   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:45.589423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:46.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:46.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.088993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.089287   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.588825   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.588900   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.589160   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:48.088919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.089001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:48.089402   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:48.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.589148   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.089140   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.089204   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.089439   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.588992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:50.088977   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:50.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:50.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.089222   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.089296   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.089666   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.589233   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.589315   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.589663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:52.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.089519   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.089816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:52.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:52.589625   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.589697   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.089857   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.089935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.090294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.089419   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.588992   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.589064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.589387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:54.589442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:55.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.088988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:55.589056   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.589135   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.589478   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.089109   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.548010   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:56.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.589293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.618422   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621596   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621692   48804 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:27:57.089694   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.089774   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.090105   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:57.090156   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:57.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.089844   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.089911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.090167   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.588968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.089080   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.089152   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.089448   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.589149   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.589228   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.589504   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:59.589556   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:00.089006   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:00.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.089210   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.089282   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.088960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:02.089423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:02.547921   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:28:02.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.589300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.609226   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612351   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612446   48804 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:28:02.615606   48804 out.go:179] * Enabled addons: 
	I1201 19:28:02.619164   48804 addons.go:530] duration metric: took 1m51.54309696s for enable addons: enabled=[]
	I1201 19:28:03.089670   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.090185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:03.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.089110   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.588949   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.589049   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.589402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:04.589461   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:05.089449   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.089546   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.089857   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:05.589588   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.589671   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.589935   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.089746   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.089819   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.090155   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.588853   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.588925   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.589422   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:07.089306   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.089384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.089671   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:07.089725   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:07.589476   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.089738   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.090110   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.589762   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.589829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.590138   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:09.589404   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:10.089052   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.089126   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:10.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.089232   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.089589   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.589170   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.589715   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:11.589763   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:12.089752   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.089829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:12.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.588998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.089832   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.089899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.090285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.588827   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.588899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.589250   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:14.088849   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.089292   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:14.089362   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:14.589658   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.589982   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.089907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.089992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.090441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.589364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:16.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.089055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:16.089598   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:16.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.589342   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.088984   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.589065   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:18.589385   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:19.089029   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:19.588898   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.089123   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.089516   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.588858   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.588926   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:21.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.089230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:21.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:21.588946   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.589356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.089252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.588847   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.588920   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.589241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.588809   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:23.589269   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:24.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:24.588960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.589427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.089763   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.089831   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.090097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.589881   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.589959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.590297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:25.590357   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:26.089013   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.089528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:26.589214   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.589286   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.589603   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.089467   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.089559   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.089881   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.589752   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.590104   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:28.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.089776   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.090051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:28.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:28.589863   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.589941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.590271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.089376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.589270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.088976   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.089446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.589171   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.589249   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.589613   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:30.589671   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:31.089382   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.089449   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.089763   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:31.589556   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.589638   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.589939   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.088836   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:33.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.089356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:33.089416   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:33.588938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.088874   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.089304   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.089364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:35.589306   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:36.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.089000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.089328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:36.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.589327   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.089234   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:37.589407   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:38.089085   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.089167   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.089517   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:38.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.588949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.589220   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.089344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:40.096455   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.096551   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.096874   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:40.097064   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:40.589786   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.589855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.590188   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.089116   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.089196   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.089535   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.589129   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.589203   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.589458   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.089448   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.089553   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.589577   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.589661   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.590007   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:42.590065   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:43.089576   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.089651   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.089904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:43.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.589746   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.590046   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.089837   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.089907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.090256   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.588933   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:45.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.089331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:45.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:45.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.589171   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.089921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.090252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:47.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.089037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.089393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:47.089451   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:47.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.588928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.589192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.088891   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.089303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.589063   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:49.089120   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.089200   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.089463   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:49.089529   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:49.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.588993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.088816   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.088895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.089241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.589245   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:51.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.089220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:51.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:51.589292   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.589374   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.589732   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.089536   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.089603   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.089870   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.589721   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.589798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.590135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.088861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.089284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.588978   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:53.589377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:54.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.089061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.089555   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:54.589299   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.589391   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.589805   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.089595   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.089665   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.089924   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.589751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:55.590097   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:56.089729   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.089807   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:56.589823   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.589890   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.589032   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.589112   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:58.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.089269   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.089543   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:58.089583   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:58.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.589585   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.589652   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.589904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:00.090091   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.090176   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.090503   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:00.090549   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:00.589349   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.589423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.589759   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.089644   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.089715   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.089978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.589828   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.589917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.590306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.588896   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.589271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:02.589323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:03.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:03.589098   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.589185   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.589576   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.089251   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.089329   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.089606   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.589278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:05.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:05.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:05.588996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.589369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.589275   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.088820   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.088892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.089135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.589860   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.589935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.590230   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:07.590276   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:08.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.089375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:08.588887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.588960   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.589213   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.088905   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.088991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.089309   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.589025   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.589102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.589421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:10.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.089125   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:10.089434   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:10.589102   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.589179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.589460   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.089329   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.089406   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.089844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.589591   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.589659   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.088917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:12.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:13.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:13.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.589047   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.089033   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.089449   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.588871   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.588938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:15.089001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:15.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:15.589096   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.589199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.589514   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.089742   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.090072   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.589844   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.589924   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.590265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:17.088934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:17.089471   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:17.589173   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.589246   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.589526   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.088963   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.089323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.589022   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.088922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.089208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.588956   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:19.589431   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:20.089101   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.089182   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.089476   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:20.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.589182   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.089165   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.089546   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.589316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:22.089229   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.089646   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:22.089715   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:22.589537   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.589607   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.589906   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.089700   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.089798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.090113   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.590144   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.088883   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.089296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.589001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.589080   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:24.589410   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:25.088898   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:25.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.089398   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.588893   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.589273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:27.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:27.089379   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:27.589046   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.589122   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.589420   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.089204   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.589028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:29.089056   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.089134   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.089452   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:29.089528   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:29.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.589233   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.589511   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.088929   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.089391   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.589289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:31.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.089217   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.089510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:31.089554   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:31.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.089002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.589582   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.589657   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:33.089719   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.089796   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:33.090228   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:33.588933   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.089059   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.089141   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.089472   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.089077   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.089208   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.089761   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.589549   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.589624   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:35.589927   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:36.089659   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.089734   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:36.588832   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.589251   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.088965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.089289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:38.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.089408   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:38.089459   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:38.588843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.589178   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.088880   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.088961   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.089264   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.588969   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.589385   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.088901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.588965   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.589041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:40.589403   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:41.089288   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.089366   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.089704   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:41.589423   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.589506   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.589815   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.089782   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.089864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.090168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.589534   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:42.589596   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:43.089242   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.089663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:43.589454   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.589549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.589901   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.089759   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.089838   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.090150   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.589175   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:45.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.089091   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:45.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:45.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.088887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.089311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.589071   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.589393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:47.089406   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.089500   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.089826   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:47.089884   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:47.589600   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.589672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.589966   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.089769   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.090162   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.588884   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.589326   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.588995   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.589073   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.589417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:49.589467   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:50.089159   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.089254   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.089647   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:50.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.589215   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.089071   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.089145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.089475   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:52.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.089238   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:52.089288   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:52.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.589750   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.589814   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.590123   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:54.089823   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.089898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.090247   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:54.090303   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:54.588852   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.588930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.089270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.588966   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.089360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.589038   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.589104   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.589401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:56.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:57.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.089090   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:57.588994   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.589111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.089014   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.089394   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.588939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.589358   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:59.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.089299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:59.089369   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:59.589697   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.589768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.089253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:01.089610   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.089745   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:01.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:01.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.589966   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.590319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.089172   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.089260   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.089600   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.088937   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.588977   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.589052   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:03.589478   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:04.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:04.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.589299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.088972   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.589752   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.589827   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:05.590195   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:06.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:06.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.088896   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.089282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:08.089094   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.089179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.089559   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:08.089615   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:08.589268   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.589341   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.589676   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.089519   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.089597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.089926   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.589719   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.589797   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.590134   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.088842   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.088923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:10.589466   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:11.089455   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.089549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.089928   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:11.589660   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.589731   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.589984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.089097   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.089199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.589383   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.589475   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.589880   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:12.589952   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:13.089681   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.089750   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:13.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.590299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.589280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:15.088982   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:15.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:15.589628   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.589698   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.590008   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.089799   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.090158   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.588891   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.588824   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:17.589312   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:18.088965   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.089050   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:18.589104   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.589181   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.589539   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.089070   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.089333   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.589020   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:19.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:20.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.089400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:20.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.589230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.589528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.089246   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.089319   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.089743   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.589336   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.589427   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.589837   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:21.589900   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:22.089716   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.089803   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.090099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:22.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.589969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.590315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.589157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:24.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.089010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.089362   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:24.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:24.589099   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.589172   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.589544   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.089055   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.089127   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.089434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.588952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:26.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.089020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:26.089524   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:26.588879   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.088972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.089314   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.589053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.589130   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.589456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.088844   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.089168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:28.589390   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:29.088940   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.089009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:29.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.589072   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.589384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.089095   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.089945   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.589738   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.590195   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:30.590251   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:31.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.089111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.089438   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:31.588985   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.589056   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.088915   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.588989   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.589324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:33.088946   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.089384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:33.089440   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:33.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.589343   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.089030   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.589373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:35.089090   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.089168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:35.089625   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:35.588821   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.589161   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.088971   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.089321   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.589745   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.589817   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.590097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:37.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.089699   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.089969   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:37.090012   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:37.589546   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.589637   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.589963   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.089733   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.089804   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.090142   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.589803   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.589876   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.088981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.089329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.589036   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.589107   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:39.589515   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:40.088909   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.089345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:40.589047   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.589120   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.089564   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.089897   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.589633   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.589911   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:41.589956   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:42.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.089280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.588990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.588921   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:44.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.089337   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:44.089388   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:44.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.589923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.590187   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.089403   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.589135   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.589226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.589637   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.088870   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.089279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.589345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:46.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:47.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:47.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.589110   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.589189   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.589550   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:48.589608   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:49.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:49.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.588965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.589274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.588817   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.588886   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.589146   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:51.089119   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.089223   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.089571   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:51.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:51.589291   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.089674   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.090013   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.589771   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.589847   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:53.089897   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.089975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.090297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:53.090359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:53.589788   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.589869   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.590118   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.088938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.089272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.588948   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.588997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:55.589383   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:56.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.089147   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.089578   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:56.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.589253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.088920   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:58.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.089773   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.090032   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:58.090073   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:58.589805   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.589877   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.088885   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.088954   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.089285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.589709   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.589783   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.590045   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:00.089976   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.090062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.090455   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:00.090523   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:00.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.589022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.089193   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.089258   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.089567   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.589248   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.589320   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.589696   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.089617   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.089689   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.090033   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.589742   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.589809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.590065   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:02.590107   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:03.089840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.089919   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.090274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:03.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.588964   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.088940   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.089202   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:05.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:05.089397   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:05.589817   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.589881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.590139   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.088913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.089226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.089268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.589804   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:07.590283   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:08.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.089427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:08.588851   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.088893   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.588982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:10.088978   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.089059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.089383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:10.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.589086   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.589443   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.089293   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.089375   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.089754   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.588862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:12.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:12.089477   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:12.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.088863   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.089236   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.589400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.089062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.589792   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.589864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.590157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:14.590205   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:15.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.089998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.090393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:15.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.089755   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.089823   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.090149   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:17.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:17.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:17.589041   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.589117   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.089353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.589024   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.589103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.089036   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.589006   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:19.589433   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:20.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.089045   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:20.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.589892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.590189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.089147   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.089226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:22.089347   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.089422   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.089710   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:22.089757   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:22.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.589640   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.589978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.089774   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.090209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.589840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.589913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.590166   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.089300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.589281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:24.589334   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:25.089831   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.089896   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:25.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.589601   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.589668   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.589943   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:26.589982   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:27.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.088939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:27.588870   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.588951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.089205   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.589061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.589381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:29.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:29.089377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:29.588973   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.088974   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.089053   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.089429   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.589066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:31.089200   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.089577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:31.089637   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:31.588904   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.089340   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.089680   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.589442   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.589524   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.589781   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:33.089603   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.089675   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.089988   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:33.090052   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:33.589773   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.590174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.089801   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.090171   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.589294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.089011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.589740   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.589810   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.590064   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:35.590105   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:36.089859   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.089929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.090255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:36.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.589001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.089192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:38.088964   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:38.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:38.588876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.588953   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.589211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.089039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.089017   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.089088   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.589633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.589707   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.590026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:40.590081   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:41.089028   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:41.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.589178   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.589434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.089390   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.089474   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.089854   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:42.590148   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:43.089748   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.089825   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.090133   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:43.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.589249   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.588882   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.589201   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:45.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.089330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:45.089382   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:45.589179   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.589564   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.089258   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.089345   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.089648   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.589361   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.589436   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.589775   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:47.089602   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.089682   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.090003   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:47.090068   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:47.589765   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.088918   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.089233   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.589032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.089114   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.089186   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.089669   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.589463   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.589550   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.589841   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:49.589889   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:50.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.089706   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.090067   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:50.589698   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.589782   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.590096   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.089244   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:52.088879   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:52.089255   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:52.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.589094   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.589168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.589423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:54.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:54.089441   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:54.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.088999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.089276   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.589378   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:56.088961   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.089405   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:56.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:56.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.588950   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.089074   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.589339   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.089278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.589453   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:58.589546   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:59.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:59.588990   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.589367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.089002   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.089412   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.589613   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.589705   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:00.590166   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:01.088830   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.089237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:01.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.089351   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.089432   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.089784   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.589531   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.589609   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.589892   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:03.089722   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:03.090212   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:03.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.589338   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.089007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.588999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.089026   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.089164   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.089649   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.589145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.589411   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:05.589452   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:06.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.089008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:06.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.089795   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.089860   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.090124   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.588839   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.589229   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:08.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:08.089432   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:08.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.088948   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.589158   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.589644   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:10.089588   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.089666   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.090026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:10.090112   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:10.588810   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.589228   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.089103   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.089180   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.089540   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:12.088890   48804 type.go:168] "Request Body" body=""
	I1201 19:32:12.089251   48804 node_ready.go:38] duration metric: took 6m0.000540563s for node "functional-428744" to be "Ready" ...
	I1201 19:32:12.092425   48804 out.go:203] 
	W1201 19:32:12.095253   48804 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1201 19:32:12.095277   48804 out.go:285] * 
	W1201 19:32:12.097463   48804 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:32:12.100606   48804 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603659604Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603669491Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603679771Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603694613Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603710530Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603722672Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603742798Z" level=info msg="runtime interface created"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603748229Z" level=info msg="created NRI interface"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603761193Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603790491Z" level=info msg="Connect containerd service"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.604067131Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.604592254Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.617640027Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.617910752Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.617840619Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.623822712Z" level=info msg="Start recovering state"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647420538Z" level=info msg="Start event monitor"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647478675Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647492172Z" level=info msg="Start streaming server"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647503740Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647512634Z" level=info msg="runtime interface starting up..."
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647519082Z" level=info msg="starting plugins..."
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647530922Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:26:09 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.649749622Z" level=info msg="containerd successfully booted in 0.071246s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:32:13.979945    9056 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:13.980687    9056 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:13.982323    9056 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:13.982670    9056 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:13.984240    9056 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:32:14 up  1:14,  0 user,  load average: 0.41, 0.32, 0.59
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:32:10 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:11 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 808.
	Dec 01 19:32:11 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:11 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:11 functional-428744 kubelet[8940]: E1201 19:32:11.632386    8940 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:11 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:11 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:12 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 809.
	Dec 01 19:32:12 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:12 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:12 functional-428744 kubelet[8946]: E1201 19:32:12.410775    8946 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:12 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:12 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 810.
	Dec 01 19:32:13 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:13 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:13 functional-428744 kubelet[8967]: E1201 19:32:13.162851    8967 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Dec 01 19:32:13 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:13 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:13 functional-428744 kubelet[9036]: E1201 19:32:13.906834    9036 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (437.803785ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart (368.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods (2.17s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-428744 get po -A
functional_test.go:711: (dbg) Non-zero exit: kubectl --context functional-428744 get po -A: exit status 1 (59.711623ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:713: failed to get kubectl pods: args "kubectl --context functional-428744 get po -A" : exit status 1
functional_test.go:717: expected stderr to be empty but got *"The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?\n"*: args "kubectl --context functional-428744 get po -A"
functional_test.go:720: expected stdout to include *kube-system* but got *""*. args: "kubectl --context functional-428744 get po -A"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (322.235849ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /usr/share/ca-certificates/43052.pem                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh sudo cat /etc/test/nested/copy/4305/hosts                                                                                                 │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image save kicbase/echo-server:functional-019259 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ update-context │ functional-019259 update-context --alsologtostderr -v=2                                                                                                         │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ update-context │ functional-019259 update-context --alsologtostderr -v=2                                                                                                         │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image rm kicbase/echo-server:functional-019259 --alsologtostderr                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image save --daemon kicbase/echo-server:functional-019259 --alsologtostderr                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format yaml --alsologtostderr                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format short --alsologtostderr                                                                                                     │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format json --alsologtostderr                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls --format table --alsologtostderr                                                                                                     │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh            │ functional-019259 ssh pgrep buildkitd                                                                                                                           │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ image          │ functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr                                                          │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image          │ functional-019259 image ls                                                                                                                                      │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ delete         │ -p functional-019259                                                                                                                                            │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ start          │ -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ start          │ -p functional-428744 --alsologtostderr -v=8                                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:26 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:26:06
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:26:06.760311   48804 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:26:06.760471   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760480   48804 out.go:374] Setting ErrFile to fd 2...
	I1201 19:26:06.760485   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760749   48804 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:26:06.761114   48804 out.go:368] Setting JSON to false
	I1201 19:26:06.761974   48804 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4118,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:26:06.762048   48804 start.go:143] virtualization:  
	I1201 19:26:06.765446   48804 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:26:06.769259   48804 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:26:06.769379   48804 notify.go:221] Checking for updates...
	I1201 19:26:06.775400   48804 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:26:06.778339   48804 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:06.781100   48804 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:26:06.784047   48804 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:26:06.786945   48804 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:26:06.790355   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:06.790504   48804 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:26:06.817889   48804 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:26:06.818002   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.874928   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.865437959 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.875040   48804 docker.go:319] overlay module found
	I1201 19:26:06.878298   48804 out.go:179] * Using the docker driver based on existing profile
	I1201 19:26:06.881322   48804 start.go:309] selected driver: docker
	I1201 19:26:06.881345   48804 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.881455   48804 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:26:06.881703   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.946129   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.93658681 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.946541   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:06.946612   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:06.946692   48804 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.949952   48804 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:26:06.952666   48804 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:26:06.955511   48804 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:26:06.958482   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:06.958560   48804 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:26:06.978189   48804 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:26:06.978215   48804 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:26:07.013576   48804 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:26:07.245550   48804 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:26:07.245729   48804 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:26:07.245814   48804 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245902   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:26:07.245911   48804 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 111.155µs
	I1201 19:26:07.245925   48804 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:26:07.245935   48804 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245965   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:26:07.245971   48804 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 37.068µs
	I1201 19:26:07.245977   48804 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:26:07.245979   48804 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:26:07.245986   48804 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246018   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:26:07.246022   48804 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 37.371µs
	I1201 19:26:07.246020   48804 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246029   48804 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246041   48804 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246063   48804 start.go:364] duration metric: took 29.397µs to acquireMachinesLock for "functional-428744"
	I1201 19:26:07.246076   48804 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:26:07.246081   48804 fix.go:54] fixHost starting: 
	I1201 19:26:07.246083   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:26:07.246089   48804 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 51.212µs
	I1201 19:26:07.246094   48804 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246103   48804 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246129   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:26:07.246135   48804 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 32.744µs
	I1201 19:26:07.246145   48804 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246154   48804 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246179   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:26:07.246184   48804 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.013µs
	I1201 19:26:07.246189   48804 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:26:07.246197   48804 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246221   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:26:07.246225   48804 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 29.356µs
	I1201 19:26:07.246230   48804 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:26:07.246238   48804 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246268   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:26:07.246273   48804 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 35.526µs
	I1201 19:26:07.246278   48804 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:26:07.246288   48804 cache.go:87] Successfully saved all images to host disk.
	I1201 19:26:07.246352   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:07.263626   48804 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:26:07.263658   48804 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:26:07.267042   48804 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:26:07.267094   48804 machine.go:94] provisionDockerMachine start ...
	I1201 19:26:07.267191   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.284298   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.284633   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.284647   48804 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:26:07.445599   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.445668   48804 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:26:07.445742   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.466448   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.466762   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.466780   48804 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:26:07.626795   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.626872   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.646204   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.646540   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.646566   48804 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:26:07.797736   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:26:07.797765   48804 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:26:07.797791   48804 ubuntu.go:190] setting up certificates
	I1201 19:26:07.797801   48804 provision.go:84] configureAuth start
	I1201 19:26:07.797871   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:07.815670   48804 provision.go:143] copyHostCerts
	I1201 19:26:07.815726   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815768   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:26:07.815790   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815876   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:26:07.815970   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.815990   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:26:07.815998   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.816026   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:26:07.816080   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816100   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:26:07.816107   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816131   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:26:07.816190   48804 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:26:07.904001   48804 provision.go:177] copyRemoteCerts
	I1201 19:26:07.904069   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:26:07.904109   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.922469   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.029518   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1201 19:26:08.029579   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:26:08.047419   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1201 19:26:08.047495   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:26:08.069296   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1201 19:26:08.069377   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:26:08.088982   48804 provision.go:87] duration metric: took 291.155414ms to configureAuth
	I1201 19:26:08.089064   48804 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:26:08.089321   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:08.089350   48804 machine.go:97] duration metric: took 822.24428ms to provisionDockerMachine
	I1201 19:26:08.089385   48804 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:26:08.089416   48804 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:26:08.089542   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:26:08.089633   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.112132   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.217325   48804 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:26:08.220778   48804 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1201 19:26:08.220802   48804 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1201 19:26:08.220808   48804 command_runner.go:130] > VERSION_ID="12"
	I1201 19:26:08.220813   48804 command_runner.go:130] > VERSION="12 (bookworm)"
	I1201 19:26:08.220817   48804 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1201 19:26:08.220820   48804 command_runner.go:130] > ID=debian
	I1201 19:26:08.220825   48804 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1201 19:26:08.220831   48804 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1201 19:26:08.220837   48804 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1201 19:26:08.220885   48804 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:26:08.220907   48804 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:26:08.220919   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:26:08.220978   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:26:08.221055   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:26:08.221066   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /etc/ssl/certs/43052.pem
	I1201 19:26:08.221140   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:26:08.221148   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> /etc/test/nested/copy/4305/hosts
	I1201 19:26:08.221198   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:26:08.229002   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:08.246695   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:26:08.263789   48804 start.go:296] duration metric: took 174.371826ms for postStartSetup
	I1201 19:26:08.263869   48804 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:26:08.263931   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.281235   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.382466   48804 command_runner.go:130] > 12%
	I1201 19:26:08.382557   48804 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:26:08.386763   48804 command_runner.go:130] > 172G
	I1201 19:26:08.387182   48804 fix.go:56] duration metric: took 1.141096136s for fixHost
	I1201 19:26:08.387210   48804 start.go:83] releasing machines lock for "functional-428744", held for 1.141138241s
	I1201 19:26:08.387280   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:08.405649   48804 ssh_runner.go:195] Run: cat /version.json
	I1201 19:26:08.405673   48804 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:26:08.405720   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.405736   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.424898   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.435929   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.615638   48804 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1201 19:26:08.615700   48804 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1201 19:26:08.615817   48804 ssh_runner.go:195] Run: systemctl --version
	I1201 19:26:08.621830   48804 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1201 19:26:08.621881   48804 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1201 19:26:08.622279   48804 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1201 19:26:08.626405   48804 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1201 19:26:08.626689   48804 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:26:08.626779   48804 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:26:08.634801   48804 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:26:08.634864   48804 start.go:496] detecting cgroup driver to use...
	I1201 19:26:08.634909   48804 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:26:08.634995   48804 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:26:08.650643   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:26:08.663900   48804 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:26:08.663962   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:26:08.680016   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:26:08.693295   48804 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:26:08.807192   48804 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:26:08.949829   48804 docker.go:234] disabling docker service ...
	I1201 19:26:08.949910   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:26:08.965005   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:26:08.978389   48804 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:26:09.113220   48804 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:26:09.265765   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:26:09.280775   48804 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:26:09.295503   48804 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1201 19:26:09.296833   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:26:09.307263   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:26:09.316009   48804 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:26:09.316129   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:26:09.324849   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.333586   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:26:09.341989   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.350174   48804 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:26:09.358089   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:26:09.366694   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:26:09.375459   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:26:09.384162   48804 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:26:09.390646   48804 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1201 19:26:09.391441   48804 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:26:09.398673   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:09.519779   48804 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:26:09.650665   48804 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:26:09.650790   48804 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:26:09.655039   48804 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1201 19:26:09.655139   48804 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1201 19:26:09.655166   48804 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1201 19:26:09.655199   48804 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:09.655222   48804 command_runner.go:130] > Access: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655243   48804 command_runner.go:130] > Modify: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655266   48804 command_runner.go:130] > Change: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655294   48804 command_runner.go:130] >  Birth: -
	I1201 19:26:09.655330   48804 start.go:564] Will wait 60s for crictl version
	I1201 19:26:09.655409   48804 ssh_runner.go:195] Run: which crictl
	I1201 19:26:09.659043   48804 command_runner.go:130] > /usr/local/bin/crictl
	I1201 19:26:09.659221   48804 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:26:09.684907   48804 command_runner.go:130] > Version:  0.1.0
	I1201 19:26:09.684979   48804 command_runner.go:130] > RuntimeName:  containerd
	I1201 19:26:09.684999   48804 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1201 19:26:09.685021   48804 command_runner.go:130] > RuntimeApiVersion:  v1
	I1201 19:26:09.687516   48804 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:26:09.687623   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.708580   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.710309   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.728879   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.737012   48804 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:26:09.739912   48804 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:26:09.756533   48804 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:26:09.760816   48804 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1201 19:26:09.760978   48804 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:26:09.761088   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:09.761147   48804 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:26:09.788491   48804 command_runner.go:130] > {
	I1201 19:26:09.788509   48804 command_runner.go:130] >   "images":  [
	I1201 19:26:09.788514   48804 command_runner.go:130] >     {
	I1201 19:26:09.788524   48804 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1201 19:26:09.788529   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788534   48804 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1201 19:26:09.788538   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788542   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788546   48804 command_runner.go:130] >       "size":  "8032639",
	I1201 19:26:09.788552   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788556   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788560   48804 command_runner.go:130] >     },
	I1201 19:26:09.788563   48804 command_runner.go:130] >     {
	I1201 19:26:09.788570   48804 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1201 19:26:09.788573   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788578   48804 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1201 19:26:09.788582   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788586   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788598   48804 command_runner.go:130] >       "size":  "21166088",
	I1201 19:26:09.788603   48804 command_runner.go:130] >       "username":  "nonroot",
	I1201 19:26:09.788611   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788615   48804 command_runner.go:130] >     },
	I1201 19:26:09.788617   48804 command_runner.go:130] >     {
	I1201 19:26:09.788624   48804 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1201 19:26:09.788628   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788633   48804 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1201 19:26:09.788636   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788639   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788643   48804 command_runner.go:130] >       "size":  "21134420",
	I1201 19:26:09.788647   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788651   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788654   48804 command_runner.go:130] >       },
	I1201 19:26:09.788658   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788662   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788665   48804 command_runner.go:130] >     },
	I1201 19:26:09.788668   48804 command_runner.go:130] >     {
	I1201 19:26:09.788675   48804 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1201 19:26:09.788678   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788685   48804 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1201 19:26:09.788689   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788692   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788697   48804 command_runner.go:130] >       "size":  "24676285",
	I1201 19:26:09.788700   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788704   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788707   48804 command_runner.go:130] >       },
	I1201 19:26:09.788711   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788715   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788718   48804 command_runner.go:130] >     },
	I1201 19:26:09.788721   48804 command_runner.go:130] >     {
	I1201 19:26:09.788728   48804 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1201 19:26:09.788732   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788739   48804 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1201 19:26:09.788743   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788750   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788755   48804 command_runner.go:130] >       "size":  "20658969",
	I1201 19:26:09.788759   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788762   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788765   48804 command_runner.go:130] >       },
	I1201 19:26:09.788769   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788773   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788776   48804 command_runner.go:130] >     },
	I1201 19:26:09.788779   48804 command_runner.go:130] >     {
	I1201 19:26:09.788786   48804 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1201 19:26:09.788790   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788795   48804 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1201 19:26:09.788799   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788803   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788807   48804 command_runner.go:130] >       "size":  "22428165",
	I1201 19:26:09.788814   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788818   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788822   48804 command_runner.go:130] >     },
	I1201 19:26:09.788825   48804 command_runner.go:130] >     {
	I1201 19:26:09.788832   48804 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1201 19:26:09.788835   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788841   48804 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1201 19:26:09.788844   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788855   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788860   48804 command_runner.go:130] >       "size":  "15389290",
	I1201 19:26:09.788863   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788867   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788870   48804 command_runner.go:130] >       },
	I1201 19:26:09.788874   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788878   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788881   48804 command_runner.go:130] >     },
	I1201 19:26:09.788883   48804 command_runner.go:130] >     {
	I1201 19:26:09.788890   48804 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1201 19:26:09.788897   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788902   48804 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1201 19:26:09.788905   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788908   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788912   48804 command_runner.go:130] >       "size":  "265458",
	I1201 19:26:09.788920   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788924   48804 command_runner.go:130] >         "value":  "65535"
	I1201 19:26:09.788927   48804 command_runner.go:130] >       },
	I1201 19:26:09.788931   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788934   48804 command_runner.go:130] >       "pinned":  true
	I1201 19:26:09.788937   48804 command_runner.go:130] >     }
	I1201 19:26:09.788940   48804 command_runner.go:130] >   ]
	I1201 19:26:09.788943   48804 command_runner.go:130] > }
	I1201 19:26:09.791239   48804 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:26:09.791264   48804 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:26:09.791273   48804 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:26:09.791374   48804 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:26:09.791446   48804 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:26:09.822661   48804 command_runner.go:130] > {
	I1201 19:26:09.822679   48804 command_runner.go:130] >   "cniconfig": {
	I1201 19:26:09.822684   48804 command_runner.go:130] >     "Networks": [
	I1201 19:26:09.822688   48804 command_runner.go:130] >       {
	I1201 19:26:09.822694   48804 command_runner.go:130] >         "Config": {
	I1201 19:26:09.822699   48804 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1201 19:26:09.822704   48804 command_runner.go:130] >           "Name": "cni-loopback",
	I1201 19:26:09.822709   48804 command_runner.go:130] >           "Plugins": [
	I1201 19:26:09.822712   48804 command_runner.go:130] >             {
	I1201 19:26:09.822717   48804 command_runner.go:130] >               "Network": {
	I1201 19:26:09.822721   48804 command_runner.go:130] >                 "ipam": {},
	I1201 19:26:09.822726   48804 command_runner.go:130] >                 "type": "loopback"
	I1201 19:26:09.822730   48804 command_runner.go:130] >               },
	I1201 19:26:09.822735   48804 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1201 19:26:09.822738   48804 command_runner.go:130] >             }
	I1201 19:26:09.822741   48804 command_runner.go:130] >           ],
	I1201 19:26:09.822751   48804 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1201 19:26:09.822755   48804 command_runner.go:130] >         },
	I1201 19:26:09.822760   48804 command_runner.go:130] >         "IFName": "lo"
	I1201 19:26:09.822764   48804 command_runner.go:130] >       }
	I1201 19:26:09.822771   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822776   48804 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1201 19:26:09.822780   48804 command_runner.go:130] >     "PluginDirs": [
	I1201 19:26:09.822784   48804 command_runner.go:130] >       "/opt/cni/bin"
	I1201 19:26:09.822787   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822792   48804 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1201 19:26:09.822795   48804 command_runner.go:130] >     "Prefix": "eth"
	I1201 19:26:09.822798   48804 command_runner.go:130] >   },
	I1201 19:26:09.822801   48804 command_runner.go:130] >   "config": {
	I1201 19:26:09.822805   48804 command_runner.go:130] >     "cdiSpecDirs": [
	I1201 19:26:09.822809   48804 command_runner.go:130] >       "/etc/cdi",
	I1201 19:26:09.822813   48804 command_runner.go:130] >       "/var/run/cdi"
	I1201 19:26:09.822816   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822823   48804 command_runner.go:130] >     "cni": {
	I1201 19:26:09.822827   48804 command_runner.go:130] >       "binDir": "",
	I1201 19:26:09.822831   48804 command_runner.go:130] >       "binDirs": [
	I1201 19:26:09.822834   48804 command_runner.go:130] >         "/opt/cni/bin"
	I1201 19:26:09.822837   48804 command_runner.go:130] >       ],
	I1201 19:26:09.822842   48804 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1201 19:26:09.822846   48804 command_runner.go:130] >       "confTemplate": "",
	I1201 19:26:09.822849   48804 command_runner.go:130] >       "ipPref": "",
	I1201 19:26:09.822853   48804 command_runner.go:130] >       "maxConfNum": 1,
	I1201 19:26:09.822857   48804 command_runner.go:130] >       "setupSerially": false,
	I1201 19:26:09.822862   48804 command_runner.go:130] >       "useInternalLoopback": false
	I1201 19:26:09.822865   48804 command_runner.go:130] >     },
	I1201 19:26:09.822872   48804 command_runner.go:130] >     "containerd": {
	I1201 19:26:09.822876   48804 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1201 19:26:09.822881   48804 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1201 19:26:09.822886   48804 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1201 19:26:09.822892   48804 command_runner.go:130] >       "runtimes": {
	I1201 19:26:09.822896   48804 command_runner.go:130] >         "runc": {
	I1201 19:26:09.822901   48804 command_runner.go:130] >           "ContainerAnnotations": null,
	I1201 19:26:09.822905   48804 command_runner.go:130] >           "PodAnnotations": null,
	I1201 19:26:09.822914   48804 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1201 19:26:09.822919   48804 command_runner.go:130] >           "cgroupWritable": false,
	I1201 19:26:09.822923   48804 command_runner.go:130] >           "cniConfDir": "",
	I1201 19:26:09.822927   48804 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1201 19:26:09.822931   48804 command_runner.go:130] >           "io_type": "",
	I1201 19:26:09.822934   48804 command_runner.go:130] >           "options": {
	I1201 19:26:09.822939   48804 command_runner.go:130] >             "BinaryName": "",
	I1201 19:26:09.822943   48804 command_runner.go:130] >             "CriuImagePath": "",
	I1201 19:26:09.822947   48804 command_runner.go:130] >             "CriuWorkPath": "",
	I1201 19:26:09.822951   48804 command_runner.go:130] >             "IoGid": 0,
	I1201 19:26:09.822955   48804 command_runner.go:130] >             "IoUid": 0,
	I1201 19:26:09.822959   48804 command_runner.go:130] >             "NoNewKeyring": false,
	I1201 19:26:09.822963   48804 command_runner.go:130] >             "Root": "",
	I1201 19:26:09.822968   48804 command_runner.go:130] >             "ShimCgroup": "",
	I1201 19:26:09.822972   48804 command_runner.go:130] >             "SystemdCgroup": false
	I1201 19:26:09.822975   48804 command_runner.go:130] >           },
	I1201 19:26:09.822980   48804 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1201 19:26:09.822987   48804 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1201 19:26:09.822991   48804 command_runner.go:130] >           "runtimePath": "",
	I1201 19:26:09.822996   48804 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1201 19:26:09.823001   48804 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1201 19:26:09.823005   48804 command_runner.go:130] >           "snapshotter": ""
	I1201 19:26:09.823008   48804 command_runner.go:130] >         }
	I1201 19:26:09.823011   48804 command_runner.go:130] >       }
	I1201 19:26:09.823014   48804 command_runner.go:130] >     },
	I1201 19:26:09.823026   48804 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1201 19:26:09.823032   48804 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1201 19:26:09.823037   48804 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1201 19:26:09.823041   48804 command_runner.go:130] >     "disableApparmor": false,
	I1201 19:26:09.823045   48804 command_runner.go:130] >     "disableHugetlbController": true,
	I1201 19:26:09.823049   48804 command_runner.go:130] >     "disableProcMount": false,
	I1201 19:26:09.823054   48804 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1201 19:26:09.823058   48804 command_runner.go:130] >     "enableCDI": true,
	I1201 19:26:09.823068   48804 command_runner.go:130] >     "enableSelinux": false,
	I1201 19:26:09.823073   48804 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1201 19:26:09.823078   48804 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1201 19:26:09.823091   48804 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1201 19:26:09.823096   48804 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1201 19:26:09.823100   48804 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1201 19:26:09.823105   48804 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1201 19:26:09.823109   48804 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1201 19:26:09.823115   48804 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823119   48804 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1201 19:26:09.823125   48804 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823129   48804 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1201 19:26:09.823135   48804 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1201 19:26:09.823138   48804 command_runner.go:130] >   },
	I1201 19:26:09.823141   48804 command_runner.go:130] >   "features": {
	I1201 19:26:09.823145   48804 command_runner.go:130] >     "supplemental_groups_policy": true
	I1201 19:26:09.823148   48804 command_runner.go:130] >   },
	I1201 19:26:09.823152   48804 command_runner.go:130] >   "golang": "go1.24.9",
	I1201 19:26:09.823162   48804 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823173   48804 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823176   48804 command_runner.go:130] >   "runtimeHandlers": [
	I1201 19:26:09.823179   48804 command_runner.go:130] >     {
	I1201 19:26:09.823183   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823188   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823194   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823197   48804 command_runner.go:130] >       }
	I1201 19:26:09.823199   48804 command_runner.go:130] >     },
	I1201 19:26:09.823202   48804 command_runner.go:130] >     {
	I1201 19:26:09.823206   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823211   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823215   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823218   48804 command_runner.go:130] >       },
	I1201 19:26:09.823221   48804 command_runner.go:130] >       "name": "runc"
	I1201 19:26:09.823228   48804 command_runner.go:130] >     }
	I1201 19:26:09.823231   48804 command_runner.go:130] >   ],
	I1201 19:26:09.823235   48804 command_runner.go:130] >   "status": {
	I1201 19:26:09.823239   48804 command_runner.go:130] >     "conditions": [
	I1201 19:26:09.823242   48804 command_runner.go:130] >       {
	I1201 19:26:09.823245   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823249   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823252   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823257   48804 command_runner.go:130] >         "type": "RuntimeReady"
	I1201 19:26:09.823260   48804 command_runner.go:130] >       },
	I1201 19:26:09.823263   48804 command_runner.go:130] >       {
	I1201 19:26:09.823269   48804 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1201 19:26:09.823274   48804 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1201 19:26:09.823277   48804 command_runner.go:130] >         "status": false,
	I1201 19:26:09.823282   48804 command_runner.go:130] >         "type": "NetworkReady"
	I1201 19:26:09.823285   48804 command_runner.go:130] >       },
	I1201 19:26:09.823288   48804 command_runner.go:130] >       {
	I1201 19:26:09.823292   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823295   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823299   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823305   48804 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1201 19:26:09.823308   48804 command_runner.go:130] >       }
	I1201 19:26:09.823310   48804 command_runner.go:130] >     ]
	I1201 19:26:09.823313   48804 command_runner.go:130] >   }
	I1201 19:26:09.823316   48804 command_runner.go:130] > }
	I1201 19:26:09.824829   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:09.824854   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:09.824874   48804 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:26:09.824897   48804 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt Sta
ticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:26:09.825029   48804 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 19:26:09.825110   48804 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:26:09.833035   48804 command_runner.go:130] > kubeadm
	I1201 19:26:09.833056   48804 command_runner.go:130] > kubectl
	I1201 19:26:09.833061   48804 command_runner.go:130] > kubelet
	I1201 19:26:09.833076   48804 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:26:09.833134   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:26:09.840788   48804 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:26:09.853581   48804 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:26:09.866488   48804 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1201 19:26:09.879364   48804 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:26:09.883102   48804 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1201 19:26:09.883255   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:10.007542   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:10.337813   48804 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:26:10.337836   48804 certs.go:195] generating shared ca certs ...
	I1201 19:26:10.337853   48804 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:10.338014   48804 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:26:10.338073   48804 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:26:10.338085   48804 certs.go:257] generating profile certs ...
	I1201 19:26:10.338185   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:26:10.338247   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:26:10.338297   48804 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:26:10.338309   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1201 19:26:10.338322   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1201 19:26:10.338339   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1201 19:26:10.338351   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1201 19:26:10.338365   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1201 19:26:10.338377   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1201 19:26:10.338392   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1201 19:26:10.338406   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1201 19:26:10.338461   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:26:10.338495   48804 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:26:10.338507   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:26:10.338544   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:26:10.338574   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:26:10.338602   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:26:10.338653   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:10.338691   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.338709   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.338720   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem -> /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.339292   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:26:10.367504   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:26:10.391051   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:26:10.410924   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:26:10.429158   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:26:10.447137   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:26:10.464077   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:26:10.481473   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:26:10.498763   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:26:10.516542   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:26:10.534712   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:26:10.552802   48804 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:26:10.565633   48804 ssh_runner.go:195] Run: openssl version
	I1201 19:26:10.571657   48804 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1201 19:26:10.572092   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:26:10.580812   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584562   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584589   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584650   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.625269   48804 command_runner.go:130] > 3ec20f2e
	I1201 19:26:10.625746   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:26:10.633767   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:26:10.642160   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.645995   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646248   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646315   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.686937   48804 command_runner.go:130] > b5213941
	I1201 19:26:10.687439   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:26:10.695499   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:26:10.704517   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708133   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708431   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708519   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.749422   48804 command_runner.go:130] > 51391683
	I1201 19:26:10.749951   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
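	The hash-and-symlink sequence above follows OpenSSL's CApath convention: each trusted cert in `/etc/ssl/certs` gets a `<subject-hash>.0` symlink so `openssl verify`-style lookups can find it by hash. A minimal sketch of that convention, using a throwaway self-signed cert in a temp dir rather than the paths from this run (assumes `openssl` is installed):

```shell
set -eu
tmp=$(mktemp -d)
# Create a throwaway self-signed certificate to hash.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=demoCA" -keyout "$tmp/ca.key" -out "$tmp/ca.pem" 2>/dev/null
# The subject hash is the link name CApath lookups expect.
hash=$(openssl x509 -hash -noout -in "$tmp/ca.pem")
# Mirror the log's `ln -fs /etc/ssl/certs/<name>.pem /etc/ssl/certs/<hash>.0`.
ln -fs "$tmp/ca.pem" "$tmp/$hash.0"
echo "$hash"
```

The `.0` suffix is a collision index; a second cert with the same subject hash would be linked as `<hash>.1`.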
	I1201 19:26:10.758524   48804 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762526   48804 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762565   48804 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1201 19:26:10.762572   48804 command_runner.go:130] > Device: 259,1	Inode: 1053621     Links: 1
	I1201 19:26:10.762579   48804 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:10.762585   48804 command_runner.go:130] > Access: 2025-12-01 19:22:03.818228473 +0000
	I1201 19:26:10.762590   48804 command_runner.go:130] > Modify: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762599   48804 command_runner.go:130] > Change: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762604   48804 command_runner.go:130] >  Birth: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762682   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:26:10.803623   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.804107   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:26:10.845983   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.846486   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:26:10.887221   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.887637   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:26:10.928253   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.928695   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:26:10.970677   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.971198   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:26:11.012420   48804 command_runner.go:130] > Certificate will not expire
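	The repeated `-checkend 86400` probes above are how minikube verifies each control-plane cert is still valid for at least one more day: the command exits 0 and prints "Certificate will not expire" when the cert remains valid N seconds from now, and exits nonzero with "Certificate will expire" otherwise. A self-contained sketch against a freshly generated cert (assumes `openssl` is installed; the 30-day lifetime is illustrative):

```shell
set -eu
tmp=$(mktemp -d)
# A cert valid for 30 days comfortably passes a 1-day (86400 s) check.
openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
  -subj "/CN=demo" -keyout "$tmp/k.pem" -out "$tmp/c.pem" 2>/dev/null
openssl x509 -noout -in "$tmp/c.pem" -checkend 86400
```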
	I1201 19:26:11.012544   48804 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false
CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:11.012658   48804 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:26:11.012733   48804 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:26:11.044110   48804 cri.go:89] found id: ""
	I1201 19:26:11.044177   48804 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:26:11.054430   48804 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1201 19:26:11.054508   48804 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1201 19:26:11.054530   48804 command_runner.go:130] > /var/lib/minikube/etcd:
	I1201 19:26:11.054631   48804 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:26:11.054642   48804 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:26:11.054719   48804 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:26:11.063470   48804 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:26:11.063923   48804 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-428744" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.064051   48804 kubeconfig.go:62] /home/jenkins/minikube-integration/21997-2497/kubeconfig needs updating (will repair): [kubeconfig missing "functional-428744" cluster setting kubeconfig missing "functional-428744" context setting]
	I1201 19:26:11.064410   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.064918   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.065081   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextP
rotos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.065855   48804 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1201 19:26:11.065877   48804 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1201 19:26:11.065883   48804 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1201 19:26:11.065889   48804 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1201 19:26:11.065893   48804 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1201 19:26:11.065945   48804 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1201 19:26:11.066161   48804 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:26:11.074525   48804 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1201 19:26:11.074603   48804 kubeadm.go:602] duration metric: took 19.955614ms to restartPrimaryControlPlane
	I1201 19:26:11.074623   48804 kubeadm.go:403] duration metric: took 62.08191ms to StartCluster
	I1201 19:26:11.074644   48804 settings.go:142] acquiring lock: {Name:mk0c68be267fd1e06eeb79721201896d000b433c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.074712   48804 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.075396   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.075623   48804 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 19:26:11.076036   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:11.076070   48804 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1201 19:26:11.076207   48804 addons.go:70] Setting storage-provisioner=true in profile "functional-428744"
	I1201 19:26:11.076225   48804 addons.go:239] Setting addon storage-provisioner=true in "functional-428744"
	I1201 19:26:11.076239   48804 addons.go:70] Setting default-storageclass=true in profile "functional-428744"
	I1201 19:26:11.076254   48804 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-428744"
	I1201 19:26:11.076255   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.076600   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.076785   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.081245   48804 out.go:179] * Verifying Kubernetes components...
	I1201 19:26:11.087150   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:11.117851   48804 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:26:11.119516   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.119671   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextP
rotos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.120991   48804 addons.go:239] Setting addon default-storageclass=true in "functional-428744"
	I1201 19:26:11.121044   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.121546   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.121741   48804 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.121759   48804 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1201 19:26:11.121797   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.157953   48804 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:11.157978   48804 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1201 19:26:11.158049   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.182138   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.197665   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.313464   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:11.333888   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.351804   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.088419   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088456   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088499   48804 retry.go:31] will retry after 370.622111ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088535   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088549   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088556   48804 retry.go:31] will retry after 214.864091ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
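	The "will retry after …ms" lines above come from minikube's retry helper: the `kubectl apply` fails while the apiserver is still refusing connections on 8441, so the call is repeated with growing, jittered delays until it succeeds or times out. A minimal sketch of that loop, using a made-up `flaky` helper in place of the real `kubectl apply` (the doubling delay is illustrative; the real backoff is randomized):

```shell
set -eu
state=$(mktemp)
echo 0 > "$state"
flaky() {                 # stand-in for "kubectl apply": fails twice, then succeeds
  n=$(cat "$state"); n=$((n + 1)); echo "$n" > "$state"
  [ "$n" -ge 3 ]
}
tries=0
delay=1
while ! flaky; do
  tries=$((tries + 1))
  echo "will retry after ${delay}s (attempt $tries)"
  sleep 0                 # keep the sketch fast; a real loop would sleep "$delay"
  delay=$((delay * 2))
done
echo "applied after $((tries + 1)) attempts"
```

Note the log's alternative escape hatch (`--validate=false`) is not taken: minikube keeps validation on and simply waits for the apiserver to come back.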
	I1201 19:26:12.088649   48804 node_ready.go:35] waiting up to 6m0s for node "functional-428744" to be "Ready" ...
	I1201 19:26:12.088787   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.088873   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.089197   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.304654   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.362814   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.366340   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.366413   48804 retry.go:31] will retry after 398.503688ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.459632   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:12.519830   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.523259   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.523294   48804 retry.go:31] will retry after 535.054731ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.589478   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.589570   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.589862   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.765159   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.827324   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.827370   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.827390   48804 retry.go:31] will retry after 739.755241ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.058728   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.089511   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.089585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.089856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.118077   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.118134   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.118154   48804 retry.go:31] will retry after 391.789828ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.510836   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.567332   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:13.570397   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.574026   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.574060   48804 retry.go:31] will retry after 1.18201014s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.589346   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.589417   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.589845   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.644640   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.644678   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.644695   48804 retry.go:31] will retry after 732.335964ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.089422   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.089515   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:14.089961   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:14.377221   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:14.438375   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.438421   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.438440   48804 retry.go:31] will retry after 1.236140087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.589732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.589826   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.590183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:14.756655   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:14.814049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.817149   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.817181   48804 retry.go:31] will retry after 1.12716485s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.089765   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.089856   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.090157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.588981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.675732   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:15.741410   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:15.741450   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.741469   48804 retry.go:31] will retry after 1.409201229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.944883   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:16.007405   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:16.007500   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.007543   48804 retry.go:31] will retry after 1.898784229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.089691   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.089768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.090129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:16.090198   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:16.589482   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.589810   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.089728   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.151412   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:17.212400   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.212446   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.212468   48804 retry.go:31] will retry after 4.05952317s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.588902   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.589279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.906643   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:17.968049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.968156   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.968182   48804 retry.go:31] will retry after 2.840296794s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:18.089284   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.089352   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.089631   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:18.588972   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.589046   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:18.589394   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:19.089061   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.089132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.089421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:19.588859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.588929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.589194   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.088895   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.089306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.808702   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:20.866089   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:20.869253   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:20.869291   48804 retry.go:31] will retry after 4.860979312s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.089785   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.089854   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.090172   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:21.090222   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:21.272551   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:21.327980   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:21.331648   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.331684   48804 retry.go:31] will retry after 4.891109087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.089331   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.089409   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.089753   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.589555   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.589684   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.089701   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.089772   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.090125   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.589808   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.590266   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:23.590323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:24.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.089005   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.089273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:24.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.589377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.089325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.730733   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:25.787142   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:25.790610   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:25.790640   48804 retry.go:31] will retry after 7.92097549s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:26.089409   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:26.223678   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:26.278607   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:26.281989   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.282022   48804 retry.go:31] will retry after 7.531816175s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.589432   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.589521   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.589840   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.089669   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.089751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.090069   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.589693   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.589764   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.590089   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:28.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.089997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.090335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:28.090387   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:28.589510   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.589583   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.589844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.089683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.090056   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.589880   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.589968   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.590369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:30.109683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.109762   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.110054   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:30.110098   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:30.589806   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.590200   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.089177   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.089252   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.089645   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.589198   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.589085   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:32.589565   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:33.089830   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.089902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.090208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.712788   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:33.771136   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.774250   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.774284   48804 retry.go:31] will retry after 5.105632097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.814618   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:33.891338   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.891375   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.891394   48804 retry.go:31] will retry after 5.576720242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:34.089900   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.089994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.090334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:34.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.589260   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:35.088913   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.088982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:35.089359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:35.589057   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.589129   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.589530   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.089310   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:37.089182   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.089255   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:37.089610   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:37.589091   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.589170   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.589433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.089395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.880960   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:38.943302   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:38.943343   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:38.943363   48804 retry.go:31] will retry after 13.228566353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.089598   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.089672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.089960   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:39.090011   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:39.469200   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:39.525826   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:39.528963   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.528998   48804 retry.go:31] will retry after 17.183760318s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.589169   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.589241   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.589577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.089008   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.089433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.089139   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.089214   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.089595   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.589301   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.589750   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:41.589806   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:42.089592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.089667   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.089940   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:42.589720   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.589791   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.590109   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.089757   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.089835   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.090111   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.589514   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.589585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.589848   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:43.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:44.089653   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:44.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.589895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:45.090381   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.090466   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.092630   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1201 19:26:45.589592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.589673   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.590001   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:45.590051   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:46.089834   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.089916   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.090265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:46.588963   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.089324   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.089402   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.089734   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.589563   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.589642   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.590061   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:47.590178   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:48.089732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.089808   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.090071   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:48.589851   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.589928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.590267   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.088857   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.088930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.089271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.590253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:49.590304   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:50.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:50.589030   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.589106   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.089232   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.089302   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.089614   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.589210   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.589283   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.589653   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:52.089565   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.089648   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.089984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:52.090044   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:52.172403   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:52.228163   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:52.231129   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.231163   48804 retry.go:31] will retry after 19.315790709s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.589726   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.589977   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.089744   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.089824   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.589934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.590235   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.589137   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.589243   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.589618   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:54.589675   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:55.089338   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.089423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.089771   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:55.589523   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.589594   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.589856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.089664   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.588804   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.588881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.713576   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:56.772710   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:56.775873   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:56.775910   48804 retry.go:31] will retry after 15.04087383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:57.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.089334   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.089591   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:57.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:57.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.589000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.588867   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.589237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.088980   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.089051   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.089363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.589124   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.589220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.589536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:59.589590   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:00.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.089350   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.089679   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:00.589522   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.589597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.589979   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.088921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.089174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.588991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.589359   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:02.089003   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.089441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:02.089520   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:02.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.588931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:04.089192   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.089265   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:04.089578   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:04.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.589355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.088902   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.088977   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.589296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.088932   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:06.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:07.088819   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.089191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:07.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.589307   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.589807   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.589880   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.590129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:08.590170   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:09.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.089269   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:09.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.589272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.589096   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.589428   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.089194   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.089643   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:11.089702   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:11.547197   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:11.589587   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.589653   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.589873   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.606598   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.609801   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.609839   48804 retry.go:31] will retry after 19.642669348s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.817534   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:11.881682   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.881743   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.881763   48804 retry.go:31] will retry after 44.665994167s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:12.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:12.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.088988   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:13.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:14.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:14.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.088989   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.089075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.089465   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.589182   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.589270   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:15.589609   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:16.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:16.588919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.589317   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.089377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.588846   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.589232   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:18.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:18.089371   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:18.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.589350   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.088809   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.088891   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.089153   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.588895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:20.089911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.089989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.090331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:20.090392   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:20.589054   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.589132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.589374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.089343   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.089681   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.589436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.589535   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.088935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.089210   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.588895   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.588975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:22.589363   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:23.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.089301   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:23.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.589747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.589992   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.089847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.089932   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.090273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.588986   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:24.589445   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:25.089736   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.089809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.090059   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:25.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.588915   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.089346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.588967   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:27.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.089316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:27.089370   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:27.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.089038   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.089114   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:29.089044   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.089124   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.089459   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:29.089532   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:29.589183   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.589521   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.089020   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.089103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.089462   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.088828   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.088907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.089239   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.252679   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:31.310178   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:31.313107   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.313144   48804 retry.go:31] will retry after 31.234541362s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.589652   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.589739   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.590099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:31.590157   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:32.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:32.589064   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.589140   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.089586   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.589302   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.589377   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:34.089480   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.089566   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.089825   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:34.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:34.589708   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.589788   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.088875   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.088959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.089209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.588958   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.589291   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:36.589344   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:37.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.089244   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:37.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.589284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.589536   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.589614   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.589859   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:38.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:39.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.089743   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.090090   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:39.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.590181   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.088979   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.089261   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.589033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.589335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:41.089238   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.089312   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.089670   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:41.089726   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:41.589477   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.589816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.089787   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.089858   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.090183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.088926   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.589305   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:43.589360   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:44.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:44.589583   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.589664   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.589930   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.089936   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.090240   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:45.589423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:46.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:46.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.088993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.089287   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.588825   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.588900   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.589160   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:48.088919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.089001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:48.089402   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:48.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.589148   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.089140   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.089204   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.089439   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.588992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:50.088977   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:50.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:50.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.089222   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.089296   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.089666   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.589233   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.589315   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.589663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:52.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.089519   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.089816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:52.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:52.589625   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.589697   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.089857   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.089935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.090294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.089419   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.588992   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.589064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.589387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:54.589442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:55.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.088988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:55.589056   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.589135   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.589478   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.089109   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.548010   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:56.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.589293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.618422   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621596   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621692   48804 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:27:57.089694   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.089774   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.090105   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:57.090156   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:57.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.089844   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.089911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.090167   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.588968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.089080   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.089152   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.089448   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.589149   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.589228   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.589504   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:59.589556   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:00.089006   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:00.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.089210   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.089282   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.088960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:02.089423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:02.547921   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:28:02.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.589300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.609226   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612351   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612446   48804 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:28:02.615606   48804 out.go:179] * Enabled addons: 
	I1201 19:28:02.619164   48804 addons.go:530] duration metric: took 1m51.54309696s for enable addons: enabled=[]
	I1201 19:28:03.089670   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.090185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:03.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.089110   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.588949   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.589049   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.589402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:04.589461   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:05.089449   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.089546   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.089857   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:05.589588   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.589671   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.589935   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.089746   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.089819   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.090155   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.588853   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.588925   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.589422   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:07.089306   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.089384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.089671   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:07.089725   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:07.589476   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.089738   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.090110   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.589762   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.589829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.590138   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:09.589404   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:10.089052   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.089126   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:10.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.089232   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.089589   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.589170   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.589715   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:11.589763   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:12.089752   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.089829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:12.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.588998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.089832   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.089899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.090285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.588827   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.588899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.589250   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:14.088849   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.089292   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:14.089362   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:14.589658   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.589982   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.089907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.089992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.090441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.589364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:16.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.089055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:16.089598   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:16.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.589342   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.088984   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.589065   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:18.589385   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:19.089029   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:19.588898   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.089123   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.089516   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.588858   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.588926   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:21.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.089230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:21.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:21.588946   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.589356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.089252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.588847   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.588920   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.589241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.588809   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:23.589269   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:24.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:24.588960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.589427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.089763   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.089831   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.090097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.589881   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.589959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.590297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:25.590357   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:26.089013   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.089528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:26.589214   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.589286   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.589603   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.089467   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.089559   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.089881   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.589752   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.590104   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:28.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.089776   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.090051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:28.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:28.589863   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.589941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.590271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.089376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.589270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.088976   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.089446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.589171   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.589249   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.589613   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:30.589671   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:31.089382   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.089449   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.089763   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:31.589556   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.589638   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.589939   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.088836   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:33.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.089356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:33.089416   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:33.588938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.088874   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.089304   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.089364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:35.589306   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:36.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.089000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.089328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:36.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.589327   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.089234   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:37.589407   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:38.089085   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.089167   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.089517   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:38.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.588949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.589220   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.089344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:40.096455   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.096551   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.096874   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:40.097064   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:40.589786   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.589855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.590188   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.089116   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.089196   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.089535   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.589129   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.589203   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.589458   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.089448   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.089553   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.589577   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.589661   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.590007   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:42.590065   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:43.089576   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.089651   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.089904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:43.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.589746   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.590046   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.089837   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.089907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.090256   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.588933   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:45.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.089331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:45.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:45.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.589171   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.089921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.090252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:47.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.089037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.089393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:47.089451   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:47.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.588928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.589192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.088891   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.089303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.589063   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:49.089120   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.089200   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.089463   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:49.089529   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:49.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.588993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.088816   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.088895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.089241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.589245   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:51.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.089220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:51.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:51.589292   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.589374   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.589732   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.089536   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.089603   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.089870   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.589721   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.589798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.590135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.088861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.089284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.588978   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:53.589377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:54.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.089061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.089555   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:54.589299   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.589391   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.589805   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.089595   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.089665   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.089924   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.589751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:55.590097   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:56.089729   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.089807   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:56.589823   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.589890   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.589032   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.589112   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:58.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.089269   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.089543   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:58.089583   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:58.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.589585   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.589652   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.589904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:00.090091   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.090176   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.090503   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:00.090549   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:00.589349   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.589423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.589759   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.089644   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.089715   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.089978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.589828   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.589917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.590306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.588896   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.589271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:02.589323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:03.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:03.589098   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.589185   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.589576   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.089251   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.089329   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.089606   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.589278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:05.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:05.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:05.588996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.589369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.589275   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.088820   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.088892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.089135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.589860   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.589935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.590230   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:07.590276   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:08.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.089375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:08.588887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.588960   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.589213   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.088905   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.088991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.089309   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.589025   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.589102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.589421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:10.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.089125   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:10.089434   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:10.589102   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.589179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.589460   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.089329   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.089406   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.089844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.589591   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.589659   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.088917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:12.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	... (identical GET polls against https://192.168.49.2:8441/api/v1/nodes/functional-428744 repeated every ~500ms from 19:29:13 through 19:29:46, each returning no response; every ~2.5s node_ready.go:55 logged the same retry warning: dial tcp 192.168.49.2:8441: connect: connection refused) ...
	I1201 19:29:47.089406   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.089500   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.089826   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:47.089884   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:47.589600   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.589672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.589966   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.089769   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.090162   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.588884   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.589326   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.588995   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.589073   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.589417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:49.589467   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:50.089159   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.089254   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.089647   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:50.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.589215   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.089071   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.089145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.089475   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:52.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.089238   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:52.089288   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:52.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.589750   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.589814   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.590123   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:54.089823   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.089898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.090247   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:54.090303   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:54.588852   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.588930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.089270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.588966   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.089360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.589038   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.589104   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.589401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:56.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:57.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.089090   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:57.588994   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.589111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.089014   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.089394   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.588939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.589358   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:59.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.089299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:59.089369   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:59.589697   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.589768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.089253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:01.089610   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.089745   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:01.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:01.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.589966   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.590319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.089172   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.089260   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.089600   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.088937   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.588977   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.589052   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:03.589478   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:04.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:04.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.589299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.088972   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.589752   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.589827   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:05.590195   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:06.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:06.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.088896   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.089282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:08.089094   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.089179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.089559   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:08.089615   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:08.589268   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.589341   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.589676   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.089519   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.089597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.089926   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.589719   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.589797   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.590134   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.088842   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.088923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:10.589466   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:11.089455   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.089549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.089928   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:11.589660   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.589731   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.589984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.089097   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.089199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.589383   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.589475   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.589880   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:12.589952   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:13.089681   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.089750   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:13.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.590299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.589280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:15.088982   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:15.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:15.589628   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.589698   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.590008   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.089799   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.090158   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.588891   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.588824   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:17.589312   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:18.088965   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.089050   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:18.589104   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.589181   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.589539   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.089070   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.089333   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.589020   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:19.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:20.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.089400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:20.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.589230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.589528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.089246   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.089319   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.089743   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.589336   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.589427   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.589837   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:21.589900   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:22.089716   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.089803   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.090099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:22.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.589969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.590315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.589157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:24.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.089010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.089362   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:24.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:24.589099   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.589172   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.589544   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.089055   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.089127   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.089434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.588952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:26.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.089020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:26.089524   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:26.588879   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.088972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.089314   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.589053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.589130   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.589456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.088844   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.089168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:28.589390   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:29.088940   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.089009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:29.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.589072   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.589384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.089095   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.089945   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.589738   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.590195   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:30.590251   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:31.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.089111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.089438   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:31.588985   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.589056   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.088915   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.588989   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.589324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:33.088946   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.089384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:33.089440   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:33.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.589343   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.089030   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.589373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:35.089090   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.089168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:35.089625   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:35.588821   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.589161   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.088971   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.089321   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.589745   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.589817   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.590097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:37.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.089699   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.089969   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:37.090012   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:37.589546   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.589637   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.589963   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.089733   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.089804   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.090142   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.589803   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.589876   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.088981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.089329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.589036   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.589107   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:39.589515   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:40.088909   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.089345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:40.589047   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.589120   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.089564   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.089897   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.589633   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.589911   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:41.589956   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:42.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.089280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.588990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.588921   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:44.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.089337   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:44.089388   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:44.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.589923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.590187   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.089403   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.589135   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.589226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.589637   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.088870   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.089279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.589345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:46.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:47.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:47.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.589110   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.589189   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.589550   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:48.589608   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:49.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:49.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.588965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.589274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.588817   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.588886   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.589146   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:51.089119   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.089223   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.089571   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:51.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:51.589291   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.089674   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.090013   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.589771   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.589847   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:53.089897   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.089975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.090297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:53.090359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:53.589788   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.589869   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.590118   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.088938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.089272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.588948   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.588997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:55.589383   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:56.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.089147   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.089578   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:56.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.589253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.088920   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:58.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.089773   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.090032   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:58.090073   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:58.589805   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.589877   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.088885   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.088954   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.089285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.589709   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.589783   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.590045   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:00.089976   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.090062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.090455   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:00.090523   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:00.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.589022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.089193   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.089258   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.089567   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.589248   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.589320   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.589696   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.089617   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.089689   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.090033   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.589742   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.589809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.590065   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:02.590107   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:03.089840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.089919   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.090274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:03.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.588964   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.088940   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.089202   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:05.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:05.089397   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:05.589817   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.589881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.590139   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.088913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.089226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.089268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.589804   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:07.590283   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:08.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.089427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:08.588851   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.088893   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.588982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:10.088978   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.089059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.089383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:10.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.589086   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.589443   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.089293   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.089375   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.089754   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.588862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:12.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:12.089477   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:12.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.088863   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.089236   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.589400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.089062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.589792   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.589864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.590157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:14.590205   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:15.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.089998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.090393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:15.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.089755   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.089823   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.090149   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:17.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:17.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:17.589041   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.589117   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.089353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.589024   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.589103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.089036   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.589006   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:19.589433   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:20.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.089045   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:20.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.589892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.590189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.089147   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.089226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:22.089347   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.089422   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.089710   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:22.089757   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:22.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.589640   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.589978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.089774   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.090209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.589840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.589913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.590166   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.089300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.589281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:24.589334   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:25.089831   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.089896   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:25.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.589601   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.589668   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.589943   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:26.589982   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:27.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.088939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:27.588870   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.588951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.089205   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.589061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.589381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:29.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:29.089377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:29.588973   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.088974   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.089053   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.089429   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.589066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:31.089200   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.089577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:31.089637   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:31.588904   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.089340   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.089680   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.589442   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.589524   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.589781   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:33.089603   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.089675   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.089988   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:33.090052   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:33.589773   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.590174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.089801   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.090171   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.589294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.089011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.589740   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.589810   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.590064   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:35.590105   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:36.089859   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.089929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.090255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:36.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.589001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.089192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:38.088964   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:38.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:38.588876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.588953   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.589211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.089039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.089017   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.089088   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.589633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.589707   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.590026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:40.590081   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:41.089028   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:41.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.589178   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.589434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.089390   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.089474   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.089854   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:42.590148   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:43.089748   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.089825   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.090133   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:43.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.589249   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.588882   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.589201   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:45.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.089330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:45.089382   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:45.589179   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.589564   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.089258   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.089345   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.089648   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.589361   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.589436   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.589775   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:47.089602   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.089682   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.090003   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:47.090068   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:47.589765   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.088918   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.089233   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.589032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.089114   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.089186   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.089669   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.589463   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.589550   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.589841   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:49.589889   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:50.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.089706   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.090067   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:50.589698   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.589782   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.590096   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.089244   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:52.088879   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:52.089255   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:52.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.589094   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.589168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.589423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:54.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:54.089441   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:54.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.088999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.089276   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.589378   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:56.088961   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.089405   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:56.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:56.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.588950   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.089074   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.589339   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.089278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.589453   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:58.589546   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:59.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:59.588990   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.589367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.089002   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.089412   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.589613   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.589705   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:00.590166   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:01.088830   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.089237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:01.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.089351   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.089432   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.089784   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.589531   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.589609   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.589892   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:03.089722   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:03.090212   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:03.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.589338   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.089007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.588999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.089026   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.089164   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.089649   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.589145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.589411   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:05.589452   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:06.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.089008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:06.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.089795   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.089860   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.090124   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.588839   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.589229   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:08.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:08.089432   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:08.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.088948   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.589158   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.589644   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:10.089588   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.089666   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.090026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:10.090112   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:10.588810   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.589228   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.089103   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.089180   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.089540   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:12.088890   48804 type.go:168] "Request Body" body=""
	I1201 19:32:12.089251   48804 node_ready.go:38] duration metric: took 6m0.000540563s for node "functional-428744" to be "Ready" ...
	I1201 19:32:12.092425   48804 out.go:203] 
	W1201 19:32:12.095253   48804 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1201 19:32:12.095277   48804 out.go:285] * 
	W1201 19:32:12.097463   48804 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:32:12.100606   48804 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603659604Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603669491Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603679771Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603694613Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603710530Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603722672Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603742798Z" level=info msg="runtime interface created"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603748229Z" level=info msg="created NRI interface"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603761193Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.603790491Z" level=info msg="Connect containerd service"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.604067131Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.604592254Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.617640027Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.617910752Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.617840619Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.623822712Z" level=info msg="Start recovering state"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647420538Z" level=info msg="Start event monitor"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647478675Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647492172Z" level=info msg="Start streaming server"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647503740Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647512634Z" level=info msg="runtime interface starting up..."
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647519082Z" level=info msg="starting plugins..."
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.647530922Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:26:09 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 01 19:26:09 functional-428744 containerd[5833]: time="2025-12-01T19:26:09.649749622Z" level=info msg="containerd successfully booted in 0.071246s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:32:16.409077    9194 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:16.410263    9194 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:16.411022    9194 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:16.412499    9194 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:16.412862    9194 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:32:16 up  1:14,  0 user,  load average: 0.41, 0.32, 0.59
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Dec 01 19:32:13 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:13 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:13 functional-428744 kubelet[9036]: E1201 19:32:13.906834    9036 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:13 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:14 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 812.
	Dec 01 19:32:14 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:14 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:14 functional-428744 kubelet[9070]: E1201 19:32:14.665530    9070 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:14 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:14 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:15 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 813.
	Dec 01 19:32:15 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:15 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:15 functional-428744 kubelet[9090]: E1201 19:32:15.404832    9090 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:15 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:15 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:16 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 814.
	Dec 01 19:32:16 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:16 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:16 functional-428744 kubelet[9117]: E1201 19:32:16.150527    9117 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:16 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:16 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
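The kubelet crash loop above (restart counters 812–814) is kubelet v1.35.0-beta.0 refusing to start because the host is still on cgroup v1. As an illustrative sketch (not part of the test suite), one quick way to tell which cgroup hierarchy a Linux host mounts is to check the filesystem type of `/sys/fs/cgroup`:

```shell
# Check which cgroup hierarchy is mounted at /sys/fs/cgroup.
# On cgroup v2 (unified hierarchy) hosts the filesystem type is "cgroup2fs";
# on cgroup v1 hosts it is typically "tmpfs" holding per-controller mounts.
fs_type=$(stat -fc %T /sys/fs/cgroup/)
if [ "$fs_type" = "cgroup2fs" ]; then
  echo "cgroup v2 (unified hierarchy)"
else
  echo "cgroup v1 (legacy hierarchy, fs type: $fs_type)"
fi
```

On a host like the one in this run (Ubuntu 20.04, kernel 5.15) this reports cgroup v1 unless the kernel was booted with `systemd.unified_cgroup_hierarchy=1`, which matches the validation error kubelet logs before exiting.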
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (348.263167ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods (2.17s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd (2.38s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 kubectl -- --context functional-428744 get pods
functional_test.go:731: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 kubectl -- --context functional-428744 get pods: exit status 1 (111.35444ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:734: failed to get pods. args "out/minikube-linux-arm64 -p functional-428744 kubectl -- --context functional-428744 get pods": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (358.331044ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 logs -n 25: (1.003533743s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-019259 image ls --format yaml --alsologtostderr                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format short --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format json --alsologtostderr                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format table --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh     │ functional-019259 ssh pgrep buildkitd                                                                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ image   │ functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr                                                  │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls                                                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ delete  │ -p functional-019259                                                                                                                                    │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ start   │ -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ start   │ -p functional-428744 --alsologtostderr -v=8                                                                                                             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:26 UTC │                     │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:latest                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add minikube-local-cache-test:functional-428744                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache delete minikube-local-cache-test:functional-428744                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl images                                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ cache   │ functional-428744 cache reload                                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ kubectl │ functional-428744 kubectl -- --context functional-428744 get pods                                                                                       │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:26:06
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:26:06.760311   48804 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:26:06.760471   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760480   48804 out.go:374] Setting ErrFile to fd 2...
	I1201 19:26:06.760485   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760749   48804 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:26:06.761114   48804 out.go:368] Setting JSON to false
	I1201 19:26:06.761974   48804 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4118,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:26:06.762048   48804 start.go:143] virtualization:  
	I1201 19:26:06.765446   48804 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:26:06.769259   48804 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:26:06.769379   48804 notify.go:221] Checking for updates...
	I1201 19:26:06.775400   48804 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:26:06.778339   48804 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:06.781100   48804 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:26:06.784047   48804 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:26:06.786945   48804 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:26:06.790355   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:06.790504   48804 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:26:06.817889   48804 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:26:06.818002   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.874928   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.865437959 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.875040   48804 docker.go:319] overlay module found
	I1201 19:26:06.878298   48804 out.go:179] * Using the docker driver based on existing profile
	I1201 19:26:06.881322   48804 start.go:309] selected driver: docker
	I1201 19:26:06.881345   48804 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.881455   48804 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:26:06.881703   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.946129   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.93658681 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.946541   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:06.946612   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:06.946692   48804 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPa
th: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.949952   48804 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:26:06.952666   48804 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:26:06.955511   48804 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:26:06.958482   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:06.958560   48804 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:26:06.978189   48804 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:26:06.978215   48804 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:26:07.013576   48804 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:26:07.245550   48804 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:26:07.245729   48804 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:26:07.245814   48804 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245902   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:26:07.245911   48804 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 111.155µs
	I1201 19:26:07.245925   48804 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:26:07.245935   48804 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245965   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:26:07.245971   48804 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 37.068µs
	I1201 19:26:07.245977   48804 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:26:07.245979   48804 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:26:07.245986   48804 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246018   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:26:07.246022   48804 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 37.371µs
	I1201 19:26:07.246020   48804 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246029   48804 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246041   48804 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246063   48804 start.go:364] duration metric: took 29.397µs to acquireMachinesLock for "functional-428744"
	I1201 19:26:07.246076   48804 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:26:07.246081   48804 fix.go:54] fixHost starting: 
	I1201 19:26:07.246083   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:26:07.246089   48804 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 51.212µs
	I1201 19:26:07.246094   48804 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246103   48804 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246129   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:26:07.246135   48804 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 32.744µs
	I1201 19:26:07.246145   48804 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246154   48804 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246179   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:26:07.246184   48804 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.013µs
	I1201 19:26:07.246189   48804 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:26:07.246197   48804 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246221   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:26:07.246225   48804 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 29.356µs
	I1201 19:26:07.246230   48804 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:26:07.246238   48804 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246268   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:26:07.246273   48804 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 35.526µs
	I1201 19:26:07.246278   48804 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:26:07.246288   48804 cache.go:87] Successfully saved all images to host disk.
	I1201 19:26:07.246352   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:07.263626   48804 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:26:07.263658   48804 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:26:07.267042   48804 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:26:07.267094   48804 machine.go:94] provisionDockerMachine start ...
	I1201 19:26:07.267191   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.284298   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.284633   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.284647   48804 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:26:07.445599   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.445668   48804 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:26:07.445742   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.466448   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.466762   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.466780   48804 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:26:07.626795   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.626872   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.646204   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.646540   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.646566   48804 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:26:07.797736   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:26:07.797765   48804 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:26:07.797791   48804 ubuntu.go:190] setting up certificates
	I1201 19:26:07.797801   48804 provision.go:84] configureAuth start
	I1201 19:26:07.797871   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:07.815670   48804 provision.go:143] copyHostCerts
	I1201 19:26:07.815726   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815768   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:26:07.815790   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815876   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:26:07.815970   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.815990   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:26:07.815998   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.816026   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:26:07.816080   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816100   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:26:07.816107   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816131   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:26:07.816190   48804 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:26:07.904001   48804 provision.go:177] copyRemoteCerts
	I1201 19:26:07.904069   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:26:07.904109   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.922469   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.029518   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1201 19:26:08.029579   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:26:08.047419   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1201 19:26:08.047495   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:26:08.069296   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1201 19:26:08.069377   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:26:08.088982   48804 provision.go:87] duration metric: took 291.155414ms to configureAuth
	I1201 19:26:08.089064   48804 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:26:08.089321   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:08.089350   48804 machine.go:97] duration metric: took 822.24428ms to provisionDockerMachine
	I1201 19:26:08.089385   48804 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:26:08.089416   48804 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:26:08.089542   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:26:08.089633   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.112132   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.217325   48804 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:26:08.220778   48804 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1201 19:26:08.220802   48804 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1201 19:26:08.220808   48804 command_runner.go:130] > VERSION_ID="12"
	I1201 19:26:08.220813   48804 command_runner.go:130] > VERSION="12 (bookworm)"
	I1201 19:26:08.220817   48804 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1201 19:26:08.220820   48804 command_runner.go:130] > ID=debian
	I1201 19:26:08.220825   48804 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1201 19:26:08.220831   48804 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1201 19:26:08.220837   48804 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1201 19:26:08.220885   48804 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:26:08.220907   48804 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:26:08.220919   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:26:08.220978   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:26:08.221055   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:26:08.221066   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /etc/ssl/certs/43052.pem
	I1201 19:26:08.221140   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:26:08.221148   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> /etc/test/nested/copy/4305/hosts
	I1201 19:26:08.221198   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:26:08.229002   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:08.246695   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:26:08.263789   48804 start.go:296] duration metric: took 174.371826ms for postStartSetup
	I1201 19:26:08.263869   48804 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:26:08.263931   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.281235   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.382466   48804 command_runner.go:130] > 12%
	I1201 19:26:08.382557   48804 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:26:08.386763   48804 command_runner.go:130] > 172G
	I1201 19:26:08.387182   48804 fix.go:56] duration metric: took 1.141096136s for fixHost
	I1201 19:26:08.387210   48804 start.go:83] releasing machines lock for "functional-428744", held for 1.141138241s
	I1201 19:26:08.387280   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:08.405649   48804 ssh_runner.go:195] Run: cat /version.json
	I1201 19:26:08.405673   48804 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:26:08.405720   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.405736   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.424898   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.435929   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.615638   48804 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1201 19:26:08.615700   48804 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1201 19:26:08.615817   48804 ssh_runner.go:195] Run: systemctl --version
	I1201 19:26:08.621830   48804 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1201 19:26:08.621881   48804 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1201 19:26:08.622279   48804 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1201 19:26:08.626405   48804 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1201 19:26:08.626689   48804 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:26:08.626779   48804 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:26:08.634801   48804 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:26:08.634864   48804 start.go:496] detecting cgroup driver to use...
	I1201 19:26:08.634909   48804 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:26:08.634995   48804 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:26:08.650643   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:26:08.663900   48804 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:26:08.663962   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:26:08.680016   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:26:08.693295   48804 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:26:08.807192   48804 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:26:08.949829   48804 docker.go:234] disabling docker service ...
	I1201 19:26:08.949910   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:26:08.965005   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:26:08.978389   48804 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:26:09.113220   48804 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:26:09.265765   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:26:09.280775   48804 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:26:09.295503   48804 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1201 19:26:09.296833   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:26:09.307263   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:26:09.316009   48804 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:26:09.316129   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:26:09.324849   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.333586   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:26:09.341989   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.350174   48804 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:26:09.358089   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:26:09.366694   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:26:09.375459   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:26:09.384162   48804 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:26:09.390646   48804 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1201 19:26:09.391441   48804 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:26:09.398673   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:09.519779   48804 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:26:09.650665   48804 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:26:09.650790   48804 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:26:09.655039   48804 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1201 19:26:09.655139   48804 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1201 19:26:09.655166   48804 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1201 19:26:09.655199   48804 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:09.655222   48804 command_runner.go:130] > Access: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655243   48804 command_runner.go:130] > Modify: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655266   48804 command_runner.go:130] > Change: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655294   48804 command_runner.go:130] >  Birth: -
	I1201 19:26:09.655330   48804 start.go:564] Will wait 60s for crictl version
	I1201 19:26:09.655409   48804 ssh_runner.go:195] Run: which crictl
	I1201 19:26:09.659043   48804 command_runner.go:130] > /usr/local/bin/crictl
	I1201 19:26:09.659221   48804 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:26:09.684907   48804 command_runner.go:130] > Version:  0.1.0
	I1201 19:26:09.684979   48804 command_runner.go:130] > RuntimeName:  containerd
	I1201 19:26:09.684999   48804 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1201 19:26:09.685021   48804 command_runner.go:130] > RuntimeApiVersion:  v1
	I1201 19:26:09.687516   48804 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:26:09.687623   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.708580   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.710309   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.728879   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
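The "containerd 2.1.5" shown in the next output line comes from parsing `containerd --version`. A sketch of one way to derive it, taking the third field and stripping the leading `v` (run here against the literal output captured above; minikube's actual parsing may differ):

```shell
# Derive a bare version number from the containerd --version output above.
ver_line='containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1'
ver=$(printf '%s' "$ver_line" | awk '{print $3}' | sed 's/^v//')
echo "$ver"
```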
	I1201 19:26:09.737012   48804 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:26:09.739912   48804 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:26:09.756533   48804 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:26:09.760816   48804 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1201 19:26:09.760978   48804 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:26:09.761088   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:09.761147   48804 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:26:09.788491   48804 command_runner.go:130] > {
	I1201 19:26:09.788509   48804 command_runner.go:130] >   "images":  [
	I1201 19:26:09.788514   48804 command_runner.go:130] >     {
	I1201 19:26:09.788524   48804 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1201 19:26:09.788529   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788534   48804 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1201 19:26:09.788538   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788542   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788546   48804 command_runner.go:130] >       "size":  "8032639",
	I1201 19:26:09.788552   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788556   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788560   48804 command_runner.go:130] >     },
	I1201 19:26:09.788563   48804 command_runner.go:130] >     {
	I1201 19:26:09.788570   48804 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1201 19:26:09.788573   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788578   48804 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1201 19:26:09.788582   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788586   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788598   48804 command_runner.go:130] >       "size":  "21166088",
	I1201 19:26:09.788603   48804 command_runner.go:130] >       "username":  "nonroot",
	I1201 19:26:09.788611   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788615   48804 command_runner.go:130] >     },
	I1201 19:26:09.788617   48804 command_runner.go:130] >     {
	I1201 19:26:09.788624   48804 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1201 19:26:09.788628   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788633   48804 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1201 19:26:09.788636   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788639   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788643   48804 command_runner.go:130] >       "size":  "21134420",
	I1201 19:26:09.788647   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788651   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788654   48804 command_runner.go:130] >       },
	I1201 19:26:09.788658   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788662   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788665   48804 command_runner.go:130] >     },
	I1201 19:26:09.788668   48804 command_runner.go:130] >     {
	I1201 19:26:09.788675   48804 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1201 19:26:09.788678   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788685   48804 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1201 19:26:09.788689   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788692   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788697   48804 command_runner.go:130] >       "size":  "24676285",
	I1201 19:26:09.788700   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788704   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788707   48804 command_runner.go:130] >       },
	I1201 19:26:09.788711   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788715   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788718   48804 command_runner.go:130] >     },
	I1201 19:26:09.788721   48804 command_runner.go:130] >     {
	I1201 19:26:09.788728   48804 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1201 19:26:09.788732   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788739   48804 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1201 19:26:09.788743   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788750   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788755   48804 command_runner.go:130] >       "size":  "20658969",
	I1201 19:26:09.788759   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788762   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788765   48804 command_runner.go:130] >       },
	I1201 19:26:09.788769   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788773   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788776   48804 command_runner.go:130] >     },
	I1201 19:26:09.788779   48804 command_runner.go:130] >     {
	I1201 19:26:09.788786   48804 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1201 19:26:09.788790   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788795   48804 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1201 19:26:09.788799   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788803   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788807   48804 command_runner.go:130] >       "size":  "22428165",
	I1201 19:26:09.788814   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788818   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788822   48804 command_runner.go:130] >     },
	I1201 19:26:09.788825   48804 command_runner.go:130] >     {
	I1201 19:26:09.788832   48804 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1201 19:26:09.788835   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788841   48804 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1201 19:26:09.788844   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788855   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788860   48804 command_runner.go:130] >       "size":  "15389290",
	I1201 19:26:09.788863   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788867   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788870   48804 command_runner.go:130] >       },
	I1201 19:26:09.788874   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788878   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788881   48804 command_runner.go:130] >     },
	I1201 19:26:09.788883   48804 command_runner.go:130] >     {
	I1201 19:26:09.788890   48804 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1201 19:26:09.788897   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788902   48804 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1201 19:26:09.788905   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788908   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788912   48804 command_runner.go:130] >       "size":  "265458",
	I1201 19:26:09.788920   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788924   48804 command_runner.go:130] >         "value":  "65535"
	I1201 19:26:09.788927   48804 command_runner.go:130] >       },
	I1201 19:26:09.788931   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788934   48804 command_runner.go:130] >       "pinned":  true
	I1201 19:26:09.788937   48804 command_runner.go:130] >     }
	I1201 19:26:09.788940   48804 command_runner.go:130] >   ]
	I1201 19:26:09.788943   48804 command_runner.go:130] > }
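The preload check above only needs the `repoTags` entries out of the `crictl images --output json` payload. A jq-free sketch against a compacted two-image sample of that JSON (the sample string is a stand-in, not the full payload):

```shell
# Pull image tags out of a trimmed sample of the crictl images JSON above.
images_json='{"images":[{"repoTags":["registry.k8s.io/pause:3.10.1"]},{"repoTags":["registry.k8s.io/etcd:3.6.5-0"]}]}'
# Match quoted repo:tag strings; repo names here are lowercase with . / - chars.
tags=$(printf '%s' "$images_json" | grep -o '"[a-z0-9./-]*:[a-zA-Z0-9.-]*"' | tr -d '"')
echo "$tags"
```

With `jq` available, `jq -r '.images[].repoTags[]'` would do the same more robustly.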
	I1201 19:26:09.791239   48804 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:26:09.791264   48804 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:26:09.791273   48804 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:26:09.791374   48804 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:26:09.791446   48804 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:26:09.822661   48804 command_runner.go:130] > {
	I1201 19:26:09.822679   48804 command_runner.go:130] >   "cniconfig": {
	I1201 19:26:09.822684   48804 command_runner.go:130] >     "Networks": [
	I1201 19:26:09.822688   48804 command_runner.go:130] >       {
	I1201 19:26:09.822694   48804 command_runner.go:130] >         "Config": {
	I1201 19:26:09.822699   48804 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1201 19:26:09.822704   48804 command_runner.go:130] >           "Name": "cni-loopback",
	I1201 19:26:09.822709   48804 command_runner.go:130] >           "Plugins": [
	I1201 19:26:09.822712   48804 command_runner.go:130] >             {
	I1201 19:26:09.822717   48804 command_runner.go:130] >               "Network": {
	I1201 19:26:09.822721   48804 command_runner.go:130] >                 "ipam": {},
	I1201 19:26:09.822726   48804 command_runner.go:130] >                 "type": "loopback"
	I1201 19:26:09.822730   48804 command_runner.go:130] >               },
	I1201 19:26:09.822735   48804 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1201 19:26:09.822738   48804 command_runner.go:130] >             }
	I1201 19:26:09.822741   48804 command_runner.go:130] >           ],
	I1201 19:26:09.822751   48804 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1201 19:26:09.822755   48804 command_runner.go:130] >         },
	I1201 19:26:09.822760   48804 command_runner.go:130] >         "IFName": "lo"
	I1201 19:26:09.822764   48804 command_runner.go:130] >       }
	I1201 19:26:09.822771   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822776   48804 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1201 19:26:09.822780   48804 command_runner.go:130] >     "PluginDirs": [
	I1201 19:26:09.822784   48804 command_runner.go:130] >       "/opt/cni/bin"
	I1201 19:26:09.822787   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822792   48804 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1201 19:26:09.822795   48804 command_runner.go:130] >     "Prefix": "eth"
	I1201 19:26:09.822798   48804 command_runner.go:130] >   },
	I1201 19:26:09.822801   48804 command_runner.go:130] >   "config": {
	I1201 19:26:09.822805   48804 command_runner.go:130] >     "cdiSpecDirs": [
	I1201 19:26:09.822809   48804 command_runner.go:130] >       "/etc/cdi",
	I1201 19:26:09.822813   48804 command_runner.go:130] >       "/var/run/cdi"
	I1201 19:26:09.822816   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822823   48804 command_runner.go:130] >     "cni": {
	I1201 19:26:09.822827   48804 command_runner.go:130] >       "binDir": "",
	I1201 19:26:09.822831   48804 command_runner.go:130] >       "binDirs": [
	I1201 19:26:09.822834   48804 command_runner.go:130] >         "/opt/cni/bin"
	I1201 19:26:09.822837   48804 command_runner.go:130] >       ],
	I1201 19:26:09.822842   48804 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1201 19:26:09.822846   48804 command_runner.go:130] >       "confTemplate": "",
	I1201 19:26:09.822849   48804 command_runner.go:130] >       "ipPref": "",
	I1201 19:26:09.822853   48804 command_runner.go:130] >       "maxConfNum": 1,
	I1201 19:26:09.822857   48804 command_runner.go:130] >       "setupSerially": false,
	I1201 19:26:09.822862   48804 command_runner.go:130] >       "useInternalLoopback": false
	I1201 19:26:09.822865   48804 command_runner.go:130] >     },
	I1201 19:26:09.822872   48804 command_runner.go:130] >     "containerd": {
	I1201 19:26:09.822876   48804 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1201 19:26:09.822881   48804 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1201 19:26:09.822886   48804 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1201 19:26:09.822892   48804 command_runner.go:130] >       "runtimes": {
	I1201 19:26:09.822896   48804 command_runner.go:130] >         "runc": {
	I1201 19:26:09.822901   48804 command_runner.go:130] >           "ContainerAnnotations": null,
	I1201 19:26:09.822905   48804 command_runner.go:130] >           "PodAnnotations": null,
	I1201 19:26:09.822914   48804 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1201 19:26:09.822919   48804 command_runner.go:130] >           "cgroupWritable": false,
	I1201 19:26:09.822923   48804 command_runner.go:130] >           "cniConfDir": "",
	I1201 19:26:09.822927   48804 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1201 19:26:09.822931   48804 command_runner.go:130] >           "io_type": "",
	I1201 19:26:09.822934   48804 command_runner.go:130] >           "options": {
	I1201 19:26:09.822939   48804 command_runner.go:130] >             "BinaryName": "",
	I1201 19:26:09.822943   48804 command_runner.go:130] >             "CriuImagePath": "",
	I1201 19:26:09.822947   48804 command_runner.go:130] >             "CriuWorkPath": "",
	I1201 19:26:09.822951   48804 command_runner.go:130] >             "IoGid": 0,
	I1201 19:26:09.822955   48804 command_runner.go:130] >             "IoUid": 0,
	I1201 19:26:09.822959   48804 command_runner.go:130] >             "NoNewKeyring": false,
	I1201 19:26:09.822963   48804 command_runner.go:130] >             "Root": "",
	I1201 19:26:09.822968   48804 command_runner.go:130] >             "ShimCgroup": "",
	I1201 19:26:09.822972   48804 command_runner.go:130] >             "SystemdCgroup": false
	I1201 19:26:09.822975   48804 command_runner.go:130] >           },
	I1201 19:26:09.822980   48804 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1201 19:26:09.822987   48804 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1201 19:26:09.822991   48804 command_runner.go:130] >           "runtimePath": "",
	I1201 19:26:09.822996   48804 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1201 19:26:09.823001   48804 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1201 19:26:09.823005   48804 command_runner.go:130] >           "snapshotter": ""
	I1201 19:26:09.823008   48804 command_runner.go:130] >         }
	I1201 19:26:09.823011   48804 command_runner.go:130] >       }
	I1201 19:26:09.823014   48804 command_runner.go:130] >     },
	I1201 19:26:09.823026   48804 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1201 19:26:09.823032   48804 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1201 19:26:09.823037   48804 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1201 19:26:09.823041   48804 command_runner.go:130] >     "disableApparmor": false,
	I1201 19:26:09.823045   48804 command_runner.go:130] >     "disableHugetlbController": true,
	I1201 19:26:09.823049   48804 command_runner.go:130] >     "disableProcMount": false,
	I1201 19:26:09.823054   48804 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1201 19:26:09.823058   48804 command_runner.go:130] >     "enableCDI": true,
	I1201 19:26:09.823068   48804 command_runner.go:130] >     "enableSelinux": false,
	I1201 19:26:09.823073   48804 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1201 19:26:09.823078   48804 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1201 19:26:09.823091   48804 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1201 19:26:09.823096   48804 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1201 19:26:09.823100   48804 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1201 19:26:09.823105   48804 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1201 19:26:09.823109   48804 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1201 19:26:09.823115   48804 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823119   48804 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1201 19:26:09.823125   48804 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823129   48804 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1201 19:26:09.823135   48804 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1201 19:26:09.823138   48804 command_runner.go:130] >   },
	I1201 19:26:09.823141   48804 command_runner.go:130] >   "features": {
	I1201 19:26:09.823145   48804 command_runner.go:130] >     "supplemental_groups_policy": true
	I1201 19:26:09.823148   48804 command_runner.go:130] >   },
	I1201 19:26:09.823152   48804 command_runner.go:130] >   "golang": "go1.24.9",
	I1201 19:26:09.823162   48804 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823173   48804 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823176   48804 command_runner.go:130] >   "runtimeHandlers": [
	I1201 19:26:09.823179   48804 command_runner.go:130] >     {
	I1201 19:26:09.823183   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823188   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823194   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823197   48804 command_runner.go:130] >       }
	I1201 19:26:09.823199   48804 command_runner.go:130] >     },
	I1201 19:26:09.823202   48804 command_runner.go:130] >     {
	I1201 19:26:09.823206   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823211   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823215   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823218   48804 command_runner.go:130] >       },
	I1201 19:26:09.823221   48804 command_runner.go:130] >       "name": "runc"
	I1201 19:26:09.823228   48804 command_runner.go:130] >     }
	I1201 19:26:09.823231   48804 command_runner.go:130] >   ],
	I1201 19:26:09.823235   48804 command_runner.go:130] >   "status": {
	I1201 19:26:09.823239   48804 command_runner.go:130] >     "conditions": [
	I1201 19:26:09.823242   48804 command_runner.go:130] >       {
	I1201 19:26:09.823245   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823249   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823252   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823257   48804 command_runner.go:130] >         "type": "RuntimeReady"
	I1201 19:26:09.823260   48804 command_runner.go:130] >       },
	I1201 19:26:09.823263   48804 command_runner.go:130] >       {
	I1201 19:26:09.823269   48804 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1201 19:26:09.823274   48804 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1201 19:26:09.823277   48804 command_runner.go:130] >         "status": false,
	I1201 19:26:09.823282   48804 command_runner.go:130] >         "type": "NetworkReady"
	I1201 19:26:09.823285   48804 command_runner.go:130] >       },
	I1201 19:26:09.823288   48804 command_runner.go:130] >       {
	I1201 19:26:09.823292   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823295   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823299   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823305   48804 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1201 19:26:09.823308   48804 command_runner.go:130] >       }
	I1201 19:26:09.823310   48804 command_runner.go:130] >     ]
	I1201 19:26:09.823313   48804 command_runner.go:130] >   }
	I1201 19:26:09.823316   48804 command_runner.go:130] > }
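The `status.conditions` block above reports `NetworkReady` as false until a CNI config lands in `/etc/cni/net.d` (hence the "recommending kindnet" step that follows). A sketch of checking that condition on a compacted sample payload; real use would pipe `sudo crictl info` instead of the hardcoded string:

```shell
# Check the NetworkReady condition from a compacted sample of crictl info.
status_json='{"conditions":[{"status":true,"type":"RuntimeReady"},{"reason":"NetworkPluginNotReady","status":false,"type":"NetworkReady"}]}'
if printf '%s' "$status_json" | grep -q '"status":false,"type":"NetworkReady"'; then
  net_ready=no
else
  net_ready=yes
fi
echo "NetworkReady: $net_ready"
```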
	I1201 19:26:09.824829   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:09.824854   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:09.824874   48804 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:26:09.824897   48804 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt Sta
ticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:26:09.825029   48804 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
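The rendered kubeadm config above is four YAML documents (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) joined by `---`. A quick structural sanity check on a stub file standing in for the real `/var/tmp/minikube/kubeadm.yaml.new`:

```shell
# Count YAML documents in a kubeadm-style multi-document config (stub file).
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
kind: InitConfiguration
---
kind: ClusterConfiguration
---
kind: KubeletConfiguration
---
kind: KubeProxyConfiguration
EOF
# N separators imply N+1 documents.
ndocs=$(( $(grep -c '^---$' "$cfg") + 1 ))
echo "$ndocs YAML documents"
rm -f "$cfg"
```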
	I1201 19:26:09.825110   48804 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:26:09.833035   48804 command_runner.go:130] > kubeadm
	I1201 19:26:09.833056   48804 command_runner.go:130] > kubectl
	I1201 19:26:09.833061   48804 command_runner.go:130] > kubelet
	I1201 19:26:09.833076   48804 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:26:09.833134   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:26:09.840788   48804 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:26:09.853581   48804 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:26:09.866488   48804 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1201 19:26:09.879364   48804 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:26:09.883102   48804 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1201 19:26:09.883255   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:10.007542   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:10.337813   48804 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:26:10.337836   48804 certs.go:195] generating shared ca certs ...
	I1201 19:26:10.337853   48804 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:10.338014   48804 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:26:10.338073   48804 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:26:10.338085   48804 certs.go:257] generating profile certs ...
	I1201 19:26:10.338185   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:26:10.338247   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:26:10.338297   48804 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:26:10.338309   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1201 19:26:10.338322   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1201 19:26:10.338339   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1201 19:26:10.338351   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1201 19:26:10.338365   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1201 19:26:10.338377   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1201 19:26:10.338392   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1201 19:26:10.338406   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1201 19:26:10.338461   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:26:10.338495   48804 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:26:10.338507   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:26:10.338544   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:26:10.338574   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:26:10.338602   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:26:10.338653   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:10.338691   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.338709   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.338720   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem -> /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.339292   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:26:10.367504   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:26:10.391051   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:26:10.410924   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:26:10.429158   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:26:10.447137   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:26:10.464077   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:26:10.481473   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:26:10.498763   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:26:10.516542   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:26:10.534712   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:26:10.552802   48804 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:26:10.565633   48804 ssh_runner.go:195] Run: openssl version
	I1201 19:26:10.571657   48804 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1201 19:26:10.572092   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:26:10.580812   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584562   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584589   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584650   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.625269   48804 command_runner.go:130] > 3ec20f2e
	I1201 19:26:10.625746   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:26:10.633767   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:26:10.642160   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.645995   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646248   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646315   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.686937   48804 command_runner.go:130] > b5213941
	I1201 19:26:10.687439   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:26:10.695499   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:26:10.704517   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708133   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708431   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708519   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.749422   48804 command_runner.go:130] > 51391683
	I1201 19:26:10.749951   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:26:10.758524   48804 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762526   48804 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762565   48804 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1201 19:26:10.762572   48804 command_runner.go:130] > Device: 259,1	Inode: 1053621     Links: 1
	I1201 19:26:10.762579   48804 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:10.762585   48804 command_runner.go:130] > Access: 2025-12-01 19:22:03.818228473 +0000
	I1201 19:26:10.762590   48804 command_runner.go:130] > Modify: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762599   48804 command_runner.go:130] > Change: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762604   48804 command_runner.go:130] >  Birth: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762682   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:26:10.803623   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.804107   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:26:10.845983   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.846486   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:26:10.887221   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.887637   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:26:10.928253   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.928695   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:26:10.970677   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.971198   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:26:11.012420   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:11.012544   48804 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false
CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:11.012658   48804 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:26:11.012733   48804 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:26:11.044110   48804 cri.go:89] found id: ""
	I1201 19:26:11.044177   48804 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:26:11.054430   48804 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1201 19:26:11.054508   48804 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1201 19:26:11.054530   48804 command_runner.go:130] > /var/lib/minikube/etcd:
	I1201 19:26:11.054631   48804 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:26:11.054642   48804 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:26:11.054719   48804 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:26:11.063470   48804 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:26:11.063923   48804 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-428744" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.064051   48804 kubeconfig.go:62] /home/jenkins/minikube-integration/21997-2497/kubeconfig needs updating (will repair): [kubeconfig missing "functional-428744" cluster setting kubeconfig missing "functional-428744" context setting]
	I1201 19:26:11.064410   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.064918   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.065081   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.065855   48804 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1201 19:26:11.065877   48804 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1201 19:26:11.065883   48804 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1201 19:26:11.065889   48804 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1201 19:26:11.065893   48804 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1201 19:26:11.065945   48804 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1201 19:26:11.066161   48804 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:26:11.074525   48804 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1201 19:26:11.074603   48804 kubeadm.go:602] duration metric: took 19.955614ms to restartPrimaryControlPlane
	I1201 19:26:11.074623   48804 kubeadm.go:403] duration metric: took 62.08191ms to StartCluster
	I1201 19:26:11.074644   48804 settings.go:142] acquiring lock: {Name:mk0c68be267fd1e06eeb79721201896d000b433c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.074712   48804 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.075396   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.075623   48804 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 19:26:11.076036   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:11.076070   48804 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1201 19:26:11.076207   48804 addons.go:70] Setting storage-provisioner=true in profile "functional-428744"
	I1201 19:26:11.076225   48804 addons.go:239] Setting addon storage-provisioner=true in "functional-428744"
	I1201 19:26:11.076239   48804 addons.go:70] Setting default-storageclass=true in profile "functional-428744"
	I1201 19:26:11.076254   48804 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-428744"
	I1201 19:26:11.076255   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.076600   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.076785   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.081245   48804 out.go:179] * Verifying Kubernetes components...
	I1201 19:26:11.087150   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:11.117851   48804 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:26:11.119516   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.119671   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.120991   48804 addons.go:239] Setting addon default-storageclass=true in "functional-428744"
	I1201 19:26:11.121044   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.121546   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.121741   48804 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.121759   48804 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1201 19:26:11.121797   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.157953   48804 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:11.157978   48804 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1201 19:26:11.158049   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.182138   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.197665   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.313464   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:11.333888   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.351804   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.088419   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088456   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088499   48804 retry.go:31] will retry after 370.622111ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088535   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088549   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088556   48804 retry.go:31] will retry after 214.864091ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088649   48804 node_ready.go:35] waiting up to 6m0s for node "functional-428744" to be "Ready" ...
	I1201 19:26:12.088787   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.088873   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.089197   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.304654   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.362814   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.366340   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.366413   48804 retry.go:31] will retry after 398.503688ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.459632   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:12.519830   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.523259   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.523294   48804 retry.go:31] will retry after 535.054731ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.589478   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.589570   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.589862   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.765159   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.827324   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.827370   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.827390   48804 retry.go:31] will retry after 739.755241ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.058728   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.089511   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.089585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.089856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.118077   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.118134   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.118154   48804 retry.go:31] will retry after 391.789828ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.510836   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.567332   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:13.570397   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.574026   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.574060   48804 retry.go:31] will retry after 1.18201014s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.589346   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.589417   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.589845   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.644640   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.644678   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.644695   48804 retry.go:31] will retry after 732.335964ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.089422   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.089515   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:14.089961   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:14.377221   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:14.438375   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.438421   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.438440   48804 retry.go:31] will retry after 1.236140087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.589732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.589826   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.590183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:14.756655   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:14.814049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.817149   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.817181   48804 retry.go:31] will retry after 1.12716485s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.089765   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.089856   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.090157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.588981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.675732   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:15.741410   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:15.741450   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.741469   48804 retry.go:31] will retry after 1.409201229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.944883   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:16.007405   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:16.007500   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.007543   48804 retry.go:31] will retry after 1.898784229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.089691   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.089768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.090129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:16.090198   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:16.589482   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.589810   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.089728   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.151412   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:17.212400   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.212446   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.212468   48804 retry.go:31] will retry after 4.05952317s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.588902   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.589279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.906643   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:17.968049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.968156   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.968182   48804 retry.go:31] will retry after 2.840296794s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:18.089284   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.089352   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.089631   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:18.588972   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.589046   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:18.589394   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:19.089061   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.089132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.089421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:19.588859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.588929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.589194   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.088895   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.089306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.808702   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:20.866089   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:20.869253   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:20.869291   48804 retry.go:31] will retry after 4.860979312s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.089785   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.089854   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.090172   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:21.090222   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:21.272551   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:21.327980   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:21.331648   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.331684   48804 retry.go:31] will retry after 4.891109087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.089331   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.089409   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.089753   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.589555   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.589684   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.089701   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.089772   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.090125   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.589808   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.590266   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:23.590323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:24.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.089005   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.089273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:24.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.589377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.089325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.730733   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:25.787142   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:25.790610   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:25.790640   48804 retry.go:31] will retry after 7.92097549s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:26.089409   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:26.223678   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:26.278607   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:26.281989   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.282022   48804 retry.go:31] will retry after 7.531816175s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.589432   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.589521   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.589840   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.089669   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.089751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.090069   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.589693   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.589764   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.590089   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:28.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.089997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.090335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:28.090387   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:28.589510   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.589583   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.589844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.089683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.090056   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.589880   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.589968   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.590369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:30.109683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.109762   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.110054   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:30.110098   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:30.589806   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.590200   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.089177   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.089252   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.089645   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.589198   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.589085   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:32.589565   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:33.089830   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.089902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.090208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.712788   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:33.771136   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.774250   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.774284   48804 retry.go:31] will retry after 5.105632097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.814618   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:33.891338   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.891375   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.891394   48804 retry.go:31] will retry after 5.576720242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:34.089900   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.089994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.090334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:34.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.589260   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:35.088913   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.088982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:35.089359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:35.589057   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.589129   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.589530   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.089310   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:37.089182   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.089255   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:37.089610   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:37.589091   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.589170   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.589433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.089395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.880960   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:38.943302   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:38.943343   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:38.943363   48804 retry.go:31] will retry after 13.228566353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.089598   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.089672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.089960   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:39.090011   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:39.469200   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:39.525826   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:39.528963   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.528998   48804 retry.go:31] will retry after 17.183760318s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.589169   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.589241   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.589577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.089008   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.089433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.089139   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.089214   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.089595   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.589301   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.589750   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:41.589806   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:42.089592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.089667   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.089940   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:42.589720   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.589791   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.590109   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.089757   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.089835   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.090111   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.589514   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.589585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.589848   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:43.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:44.089653   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:44.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.589895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:45.090381   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.090466   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.092630   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1201 19:26:45.589592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.589673   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.590001   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:45.590051   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:46.089834   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.089916   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.090265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:46.588963   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.089324   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.089402   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.089734   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.589563   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.589642   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.590061   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:47.590178   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:48.089732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.089808   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.090071   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:48.589851   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.589928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.590267   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.088857   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.088930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.089271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.590253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:49.590304   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:50.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:50.589030   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.589106   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.089232   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.089302   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.089614   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.589210   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.589283   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.589653   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:52.089565   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.089648   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.089984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:52.090044   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:52.172403   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:52.228163   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:52.231129   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.231163   48804 retry.go:31] will retry after 19.315790709s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.589726   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.589977   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.089744   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.089824   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.589934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.590235   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.589137   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.589243   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.589618   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:54.589675   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:55.089338   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.089423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.089771   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:55.589523   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.589594   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.589856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.089664   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.588804   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.588881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.713576   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:56.772710   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:56.775873   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:56.775910   48804 retry.go:31] will retry after 15.04087383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:57.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.089334   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.089591   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:57.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:57.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.589000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.588867   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.589237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.088980   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.089051   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.089363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.589124   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.589220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.589536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:59.589590   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:00.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.089350   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.089679   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:00.589522   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.589597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.589979   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.088921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.089174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.588991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.589359   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:02.089003   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.089441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:02.089520   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:02.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.588931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:04.089192   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.089265   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:04.089578   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:04.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.589355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.088902   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.088977   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.589296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.088932   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:06.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:07.088819   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.089191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:07.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.589307   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.589807   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.589880   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.590129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:08.590170   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:09.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.089269   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:09.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.589272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.589096   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.589428   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.089194   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.089643   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:11.089702   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:11.547197   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:11.589587   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.589653   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.589873   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.606598   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.609801   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.609839   48804 retry.go:31] will retry after 19.642669348s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.817534   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:11.881682   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.881743   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.881763   48804 retry.go:31] will retry after 44.665994167s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:12.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:12.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.088988   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:13.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:14.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:14.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.088989   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.089075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.089465   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.589182   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.589270   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:15.589609   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:16.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:16.588919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.589317   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.089377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.588846   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.589232   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:18.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:18.089371   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:18.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.589350   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.088809   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.088891   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.089153   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.588895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:20.089911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.089989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.090331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:20.090392   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:20.589054   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.589132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.589374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.089343   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.089681   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.589436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.589535   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.088935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.089210   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.588895   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.588975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:22.589363   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:23.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.089301   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:23.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.589747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.589992   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.089847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.089932   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.090273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.588986   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:24.589445   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:25.089736   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.089809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.090059   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:25.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.588915   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.089346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.588967   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:27.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.089316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:27.089370   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:27.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.089038   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.089114   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:29.089044   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.089124   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.089459   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:29.089532   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:29.589183   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.589521   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.089020   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.089103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.089462   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.088828   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.088907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.089239   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.252679   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:31.310178   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:31.313107   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.313144   48804 retry.go:31] will retry after 31.234541362s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.589652   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.589739   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.590099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:31.590157   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:32.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:32.589064   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.589140   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.089586   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.589302   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.589377   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:34.089480   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.089566   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.089825   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:34.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:34.589708   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.589788   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.088875   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.088959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.089209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.588958   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.589291   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:36.589344   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:37.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.089244   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:37.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.589284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.589536   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.589614   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.589859   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:38.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:39.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.089743   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.090090   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:39.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.590181   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.088979   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.089261   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.589033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.589335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:41.089238   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.089312   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.089670   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:41.089726   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:41.589477   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.589816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.089787   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.089858   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.090183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.088926   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.589305   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:43.589360   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:44.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:44.589583   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.589664   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.589930   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.089936   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.090240   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:45.589423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:46.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:46.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.088993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.089287   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.588825   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.588900   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.589160   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:48.088919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.089001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:48.089402   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:48.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.589148   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.089140   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.089204   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.089439   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.588992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:50.088977   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:50.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:50.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.089222   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.089296   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.089666   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.589233   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.589315   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.589663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:52.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.089519   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.089816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:52.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:52.589625   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.589697   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.089857   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.089935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.090294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.089419   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.588992   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.589064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.589387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:54.589442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:55.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.088988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:55.589056   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.589135   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.589478   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.089109   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.548010   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:56.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.589293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.618422   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621596   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621692   48804 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:27:57.089694   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.089774   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.090105   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:57.090156   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:57.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.089844   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.089911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.090167   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.588968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.089080   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.089152   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.089448   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.589149   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.589228   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.589504   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:59.589556   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:00.089006   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:00.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.089210   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.089282   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.088960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:02.089423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:02.547921   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:28:02.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.589300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.609226   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612351   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612446   48804 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:28:02.615606   48804 out.go:179] * Enabled addons: 
	I1201 19:28:02.619164   48804 addons.go:530] duration metric: took 1m51.54309696s for enable addons: enabled=[]
	I1201 19:28:03.089670   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.090185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:03.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.089110   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.588949   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.589049   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.589402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:04.589461   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:05.089449   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.089546   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.089857   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:05.589588   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.589671   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.589935   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.089746   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.089819   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.090155   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.588853   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.588925   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.589422   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:07.089306   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.089384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.089671   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:07.089725   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:07.589476   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.089738   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.090110   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.589762   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.589829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.590138   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:09.589404   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:10.089052   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.089126   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:10.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.089232   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.089589   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.589170   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.589715   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:11.589763   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:12.089752   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.089829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:12.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.588998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.089832   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.089899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.090285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.588827   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.588899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.589250   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:14.088849   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.089292   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:14.089362   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:14.589658   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.589982   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.089907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.089992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.090441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.589364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:16.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.089055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:16.089598   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:16.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.589342   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.088984   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.589065   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:18.589385   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:19.089029   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:19.588898   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.089123   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.089516   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.588858   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.588926   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:21.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.089230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:21.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:21.588946   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.589356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.089252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.588847   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.588920   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.589241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.588809   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:23.589269   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:24.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:24.588960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.589427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.089763   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.089831   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.090097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.589881   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.589959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.590297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:25.590357   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:26.089013   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.089528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:26.589214   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.589286   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.589603   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.089467   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.089559   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.089881   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.589752   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.590104   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:28.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.089776   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.090051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:28.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:28.589863   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.589941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.590271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.089376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.589270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.088976   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.089446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.589171   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.589249   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.589613   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:30.589671   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:31.089382   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.089449   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.089763   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:31.589556   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.589638   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.589939   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.088836   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:33.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.089356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:33.089416   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:33.588938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.088874   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.089304   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.089364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:35.589306   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:36.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.089000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.089328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:36.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.589327   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.089234   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:37.589407   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:38.089085   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.089167   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.089517   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:38.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.588949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.589220   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.089344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:40.096455   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.096551   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.096874   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:40.097064   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:40.589786   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.589855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.590188   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.089116   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.089196   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.089535   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.589129   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.589203   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.589458   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.089448   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.089553   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.589577   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.589661   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.590007   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:42.590065   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:43.089576   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.089651   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.089904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:43.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.589746   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.590046   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.089837   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.089907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.090256   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.588933   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:45.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.089331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:45.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:45.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.589171   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.089921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.090252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:47.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.089037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.089393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:47.089451   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:47.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.588928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.589192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.088891   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.089303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.589063   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:49.089120   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.089200   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.089463   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:49.089529   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:49.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.588993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.088816   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.088895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.089241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.589245   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:51.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.089220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:51.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:51.589292   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.589374   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.589732   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.089536   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.089603   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.089870   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.589721   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.589798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.590135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.088861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.089284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.588978   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:53.589377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:54.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.089061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.089555   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:54.589299   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.589391   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.589805   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.089595   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.089665   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.089924   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.589751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:55.590097   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:56.089729   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.089807   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:56.589823   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.589890   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.589032   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.589112   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:58.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.089269   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.089543   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:58.089583   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:58.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.589585   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.589652   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.589904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:00.090091   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.090176   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.090503   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:00.090549   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:00.589349   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.589423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.589759   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.089644   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.089715   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.089978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.589828   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.589917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.590306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.588896   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.589271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:02.589323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:03.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:03.589098   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.589185   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.589576   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.089251   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.089329   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.089606   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.589278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:05.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:05.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:05.588996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.589369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.589275   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.088820   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.088892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.089135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.589860   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.589935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.590230   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:07.590276   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:08.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.089375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:08.588887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.588960   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.589213   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.088905   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.088991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.089309   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.589025   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.589102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.589421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:10.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.089125   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:10.089434   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:10.589102   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.589179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.589460   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.089329   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.089406   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.089844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.589591   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.589659   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.088917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:12.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:13.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:13.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.589047   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.089033   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.089449   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.588871   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.588938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:15.089001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:15.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:15.589096   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.589199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.589514   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.089742   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.090072   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.589844   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.589924   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.590265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:17.088934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:17.089471   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:17.589173   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.589246   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.589526   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.088963   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.089323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.589022   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.088922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.089208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.588956   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:19.589431   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:20.089101   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.089182   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.089476   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:20.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.589182   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.089165   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.089546   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.589316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:22.089229   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.089646   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:22.089715   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:22.589537   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.589607   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.589906   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.089700   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.089798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.090113   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.590144   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.088883   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.089296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.589001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.589080   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:24.589410   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:25.088898   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:25.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.089398   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.588893   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.589273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:27.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:27.089379   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:27.589046   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.589122   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.589420   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.089204   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.589028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:29.089056   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.089134   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.089452   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:29.089528   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:29.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.589233   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.589511   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.088929   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.089391   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.589289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:31.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.089217   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.089510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:31.089554   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:31.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.089002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.589582   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.589657   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:33.089719   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.089796   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:33.090228   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:33.588933   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.089059   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.089141   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.089472   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.089077   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.089208   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.089761   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.589549   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.589624   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:35.589927   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:36.089659   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.089734   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:36.588832   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.589251   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.088965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.089289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:38.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.089408   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:38.089459   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:38.588843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.589178   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.088880   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.088961   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.089264   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.588969   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.589385   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.088901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.588965   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.589041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:40.589403   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:41.089288   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.089366   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.089704   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:41.589423   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.589506   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.589815   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.089782   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.089864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.090168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.589534   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:42.589596   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:43.089242   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.089663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:43.589454   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.589549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.589901   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.089759   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.089838   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.090150   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.589175   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:45.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.089091   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:45.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:45.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.088887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.089311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.589071   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.589393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:47.089406   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.089500   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.089826   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:47.089884   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:47.589600   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.589672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.589966   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.089769   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.090162   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.588884   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.589326   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.588995   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.589073   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.589417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:49.589467   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:50.089159   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.089254   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.089647   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:50.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.589215   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.089071   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.089145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.089475   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:52.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.089238   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:52.089288   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:52.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.589750   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.589814   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.590123   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:54.089823   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.089898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.090247   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:54.090303   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:54.588852   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.588930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.089270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.588966   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.089360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.589038   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.589104   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.589401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:56.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:57.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.089090   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:57.588994   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.589111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.089014   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.089394   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.588939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.589358   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:59.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.089299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:59.089369   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:59.589697   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.589768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.089253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:01.089610   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.089745   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:01.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:01.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.589966   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.590319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.089172   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.089260   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.089600   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.088937   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.588977   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.589052   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:03.589478   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:04.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:04.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.589299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.088972   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.589752   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.589827   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:05.590195   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:06.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:06.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.088896   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.089282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:08.089094   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.089179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.089559   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:08.089615   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:08.589268   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.589341   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.589676   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.089519   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.089597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.089926   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.589719   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.589797   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.590134   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.088842   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.088923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:10.589466   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:11.089455   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.089549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.089928   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:11.589660   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.589731   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.589984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.089097   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.089199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.589383   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.589475   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.589880   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:12.589952   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:13.089681   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.089750   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:13.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.590299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.589280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:15.088982   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:15.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:15.589628   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.589698   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.590008   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.089799   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.090158   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.588891   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.588824   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:17.589312   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:18.088965   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.089050   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:18.589104   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.589181   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.589539   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.089070   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.089333   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.589020   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:19.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:20.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.089400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:20.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.589230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.589528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.089246   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.089319   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.089743   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.589336   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.589427   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.589837   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:21.589900   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:22.089716   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.089803   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.090099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:22.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.589969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.590315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.589157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:24.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.089010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.089362   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:24.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:24.589099   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.589172   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.589544   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.089055   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.089127   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.089434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.588952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:26.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.089020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:26.089524   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:26.588879   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.088972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.089314   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.589053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.589130   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.589456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.088844   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.089168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:28.589390   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:29.088940   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.089009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:29.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.589072   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.589384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.089095   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.089945   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.589738   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.590195   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:30.590251   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:31.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.089111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.089438   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:31.588985   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.589056   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.088915   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.588989   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.589324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:33.088946   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.089384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:33.089440   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:33.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.589343   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.089030   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.589373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:35.089090   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.089168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:35.089625   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:35.588821   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.589161   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.088971   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.089321   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.589745   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.589817   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.590097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:37.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.089699   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.089969   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:37.090012   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:37.589546   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.589637   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.589963   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.089733   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.089804   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.090142   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.589803   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.589876   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.088981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.089329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.589036   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.589107   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:39.589515   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:40.088909   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.089345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:40.589047   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.589120   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.089564   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.089897   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.589633   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.589911   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:41.589956   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:42.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.089280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.588990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.588921   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:44.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.089337   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:44.089388   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:44.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.589923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.590187   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.089403   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.589135   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.589226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.589637   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.088870   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.089279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.589345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:46.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:47.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:47.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.589110   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.589189   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.589550   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:48.589608   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:49.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:49.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.588965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.589274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.588817   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.588886   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.589146   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:51.089119   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.089223   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.089571   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:51.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:51.589291   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.089674   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.090013   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.589771   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.589847   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:53.089897   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.089975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.090297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:53.090359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:53.589788   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.589869   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.590118   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.088938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.089272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.588948   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.588997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:55.589383   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:56.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.089147   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.089578   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:56.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.589253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.088920   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:58.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.089773   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.090032   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:58.090073   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:58.589805   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.589877   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.088885   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.088954   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.089285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.589709   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.589783   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.590045   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:00.089976   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.090062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.090455   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:00.090523   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:00.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.589022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.089193   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.089258   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.089567   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.589248   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.589320   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.589696   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.089617   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.089689   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.090033   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.589742   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.589809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.590065   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:02.590107   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:03.089840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.089919   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.090274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:03.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.588964   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.088940   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.089202   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:05.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:05.089397   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:05.589817   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.589881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.590139   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.088913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.089226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.089268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.589804   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:07.590283   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:08.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.089427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:08.588851   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.088893   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.588982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:10.088978   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.089059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.089383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:10.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.589086   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.589443   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.089293   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.089375   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.089754   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.588862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:12.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:12.089477   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:12.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.088863   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.089236   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.589400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.089062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.589792   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.589864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.590157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:14.590205   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:15.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.089998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.090393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:15.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.089755   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.089823   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.090149   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:17.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:17.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:17.589041   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.589117   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.089353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.589024   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.589103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.089036   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.589006   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:19.589433   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:20.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.089045   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:20.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.589892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.590189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.089147   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.089226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:22.089347   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.089422   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.089710   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:22.089757   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:22.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.589640   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.589978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.089774   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.090209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.589840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.589913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.590166   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.089300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.589281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:24.589334   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:25.089831   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.089896   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:25.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.589601   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.589668   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.589943   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:26.589982   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:27.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.088939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:27.588870   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.588951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.089205   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.589061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.589381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:29.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:29.089377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:29.588973   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.088974   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.089053   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.089429   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.589066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:31.089200   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.089577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:31.089637   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:31.588904   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.089340   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.089680   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.589442   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.589524   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.589781   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:33.089603   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.089675   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.089988   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:33.090052   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:33.589773   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.590174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.089801   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.090171   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.589294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.089011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.589740   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.589810   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.590064   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:35.590105   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:36.089859   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.089929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.090255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:36.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.589001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.089192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:38.088964   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:38.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:38.588876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.588953   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.589211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.089039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.089017   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.089088   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.589633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.589707   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.590026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:40.590081   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:41.089028   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:41.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.589178   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.589434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.089390   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.089474   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.089854   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:42.590148   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:43.089748   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.089825   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.090133   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:43.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.589249   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.588882   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.589201   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:45.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.089330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:45.089382   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:45.589179   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.589564   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.089258   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.089345   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.089648   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.589361   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.589436   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.589775   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:47.089602   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.089682   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.090003   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:47.090068   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:47.589765   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.088918   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.089233   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.589032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.089114   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.089186   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.089669   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.589463   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.589550   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.589841   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:49.589889   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:50.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.089706   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.090067   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:50.589698   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.589782   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.590096   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.089244   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:52.088879   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:52.089255   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:52.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.589094   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.589168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.589423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:54.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:54.089441   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:54.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.088999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.089276   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.589378   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:56.088961   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.089405   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:56.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:56.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.588950   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.089074   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.589339   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.089278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.589453   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:58.589546   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:59.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:59.588990   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.589367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.089002   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.089412   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.589613   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.589705   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:00.590166   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:01.088830   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.089237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:01.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.089351   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.089432   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.089784   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.589531   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.589609   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.589892   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:03.089722   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:03.090212   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:03.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.589338   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.089007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.588999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.089026   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.089164   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.089649   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.589145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.589411   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:05.589452   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:06.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.089008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:06.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.089795   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.089860   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.090124   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.588839   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.589229   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:08.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:08.089432   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:08.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.088948   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.589158   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.589644   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:10.089588   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.089666   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.090026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:10.090112   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:10.588810   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.589228   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.089103   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.089180   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.089540   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:12.088890   48804 type.go:168] "Request Body" body=""
	I1201 19:32:12.089251   48804 node_ready.go:38] duration metric: took 6m0.000540563s for node "functional-428744" to be "Ready" ...
	I1201 19:32:12.092425   48804 out.go:203] 
	W1201 19:32:12.095253   48804 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1201 19:32:12.095277   48804 out.go:285] * 
	W1201 19:32:12.097463   48804 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:32:12.100606   48804 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:19 functional-428744 containerd[5833]: time="2025-12-01T19:32:19.453990081Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.487184870Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.489388253Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.502137516Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.502566149Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.487479497Z" level=info msg="No images store for sha256:9a78e6e24df19d4b5ee9819f74178ce844a778e46ad5f9dc53101feb167831e4"
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.490112047Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-428744\""
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.497324365Z" level=info msg="ImageCreate event name:\"sha256:4f3a5d641d9b7a5007231441eda3adf17b6874d8b72429dc7a44618c67a293d6\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.497866456Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-428744\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.289150852Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.291773015Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.293690572Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.305668049Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.267444727Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.269891231Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.271821161Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.292828735Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.400987063Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.403215357Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.410469710Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.411816327Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.523705796Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.526083600Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.533210046Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.533572556Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:32:25.362205    9820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:25.363095    9820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:25.364231    9820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:25.364902    9820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:25.365927    9820 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:32:25 up  1:14,  0 user,  load average: 0.42, 0.33, 0.59
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:32:22 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:22 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 823.
	Dec 01 19:32:22 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:22 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:22 functional-428744 kubelet[9605]: E1201 19:32:22.887312    9605 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:22 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:22 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:23 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 824.
	Dec 01 19:32:23 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:23 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:23 functional-428744 kubelet[9694]: E1201 19:32:23.585106    9694 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:23 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:23 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:24 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 825.
	Dec 01 19:32:24 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:24 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:24 functional-428744 kubelet[9729]: E1201 19:32:24.397143    9729 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:24 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:24 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 826.
	Dec 01 19:32:25 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:25 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:25 functional-428744 kubelet[9768]: E1201 19:32:25.165328    9768 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
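The kubelet log above shows the root cause of the restart loop: kubelet v1.35.0-beta.0 refuses to start on a host still using cgroup v1. As a diagnostic sketch (not part of the test suite), the host's cgroup version can be probed by checking the filesystem type mounted at /sys/fs/cgroup:

```shell
# Probe the cgroup hierarchy type: 'cgroup2fs' indicates cgroup v2
# (unified hierarchy); anything else (typically 'tmpfs') indicates a
# cgroup v1 host, which this kubelet build rejects at startup.
fs_type=$(stat -fc %T /sys/fs/cgroup/ 2>/dev/null)
if [ "$fs_type" = "cgroup2fs" ]; then
    echo "cgroup v2"
else
    echo "cgroup v1"
fi
```

On Ubuntu 20.04 hosts (like the ip-172-31-24-2 runner above, per the kernel section), this prints "cgroup v1" unless the kernel was booted with `systemd.unified_cgroup_hierarchy=1`, which is consistent with the validation failure logged by kubelet.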
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (365.77505ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd (2.38s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly (2.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-428744 get pods
functional_test.go:756: (dbg) Non-zero exit: out/kubectl --context functional-428744 get pods: exit status 1 (108.867336ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:759: failed to run kubectl directly. args "out/kubectl --context functional-428744 get pods": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (315.457211ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-019259 image ls --format yaml --alsologtostderr                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format short --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format json --alsologtostderr                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format table --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh     │ functional-019259 ssh pgrep buildkitd                                                                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ image   │ functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr                                                  │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls                                                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ delete  │ -p functional-019259                                                                                                                                    │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ start   │ -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ start   │ -p functional-428744 --alsologtostderr -v=8                                                                                                             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:26 UTC │                     │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:latest                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add minikube-local-cache-test:functional-428744                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache delete minikube-local-cache-test:functional-428744                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl images                                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ cache   │ functional-428744 cache reload                                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ kubectl │ functional-428744 kubectl -- --context functional-428744 get pods                                                                                       │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:26:06
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:26:06.760311   48804 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:26:06.760471   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760480   48804 out.go:374] Setting ErrFile to fd 2...
	I1201 19:26:06.760485   48804 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:26:06.760749   48804 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:26:06.761114   48804 out.go:368] Setting JSON to false
	I1201 19:26:06.761974   48804 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4118,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:26:06.762048   48804 start.go:143] virtualization:  
	I1201 19:26:06.765446   48804 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:26:06.769259   48804 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:26:06.769379   48804 notify.go:221] Checking for updates...
	I1201 19:26:06.775400   48804 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:26:06.778339   48804 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:06.781100   48804 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:26:06.784047   48804 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:26:06.786945   48804 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:26:06.790355   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:06.790504   48804 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:26:06.817889   48804 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:26:06.818002   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.874928   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.865437959 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.875040   48804 docker.go:319] overlay module found
	I1201 19:26:06.878298   48804 out.go:179] * Using the docker driver based on existing profile
	I1201 19:26:06.881322   48804 start.go:309] selected driver: docker
	I1201 19:26:06.881345   48804 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.881455   48804 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:26:06.881703   48804 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:26:06.946129   48804 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:26:06.93658681 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:26:06.946541   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:06.946612   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:06.946692   48804 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:06.949952   48804 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:26:06.952666   48804 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:26:06.955511   48804 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:26:06.958482   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:06.958560   48804 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:26:06.978189   48804 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:26:06.978215   48804 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:26:07.013576   48804 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:26:07.245550   48804 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:26:07.245729   48804 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:26:07.245814   48804 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245902   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:26:07.245911   48804 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 111.155µs
	I1201 19:26:07.245925   48804 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:26:07.245935   48804 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.245965   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:26:07.245971   48804 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 37.068µs
	I1201 19:26:07.245977   48804 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:26:07.245979   48804 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:26:07.245986   48804 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246018   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:26:07.246022   48804 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 37.371µs
	I1201 19:26:07.246020   48804 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246029   48804 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246041   48804 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246063   48804 start.go:364] duration metric: took 29.397µs to acquireMachinesLock for "functional-428744"
	I1201 19:26:07.246076   48804 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:26:07.246081   48804 fix.go:54] fixHost starting: 
	I1201 19:26:07.246083   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:26:07.246089   48804 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 51.212µs
	I1201 19:26:07.246094   48804 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246103   48804 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246129   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:26:07.246135   48804 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 32.744µs
	I1201 19:26:07.246145   48804 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:26:07.246154   48804 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246179   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:26:07.246184   48804 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.013µs
	I1201 19:26:07.246189   48804 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:26:07.246197   48804 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246221   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:26:07.246225   48804 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 29.356µs
	I1201 19:26:07.246230   48804 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:26:07.246238   48804 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:26:07.246268   48804 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:26:07.246273   48804 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 35.526µs
	I1201 19:26:07.246278   48804 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:26:07.246288   48804 cache.go:87] Successfully saved all images to host disk.
	I1201 19:26:07.246352   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:07.263626   48804 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:26:07.263658   48804 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:26:07.267042   48804 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:26:07.267094   48804 machine.go:94] provisionDockerMachine start ...
	I1201 19:26:07.267191   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.284298   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.284633   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.284647   48804 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:26:07.445599   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.445668   48804 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:26:07.445742   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.466448   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.466762   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.466780   48804 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:26:07.626795   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:26:07.626872   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.646204   48804 main.go:143] libmachine: Using SSH client type: native
	I1201 19:26:07.646540   48804 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:26:07.646566   48804 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:26:07.797736   48804 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:26:07.797765   48804 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:26:07.797791   48804 ubuntu.go:190] setting up certificates
	I1201 19:26:07.797801   48804 provision.go:84] configureAuth start
	I1201 19:26:07.797871   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:07.815670   48804 provision.go:143] copyHostCerts
	I1201 19:26:07.815726   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815768   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:26:07.815790   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:26:07.815876   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:26:07.815970   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.815990   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:26:07.815998   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:26:07.816026   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:26:07.816080   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816100   48804 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:26:07.816107   48804 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:26:07.816131   48804 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:26:07.816190   48804 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:26:07.904001   48804 provision.go:177] copyRemoteCerts
	I1201 19:26:07.904069   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:26:07.904109   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:07.922469   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.029518   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1201 19:26:08.029579   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:26:08.047419   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1201 19:26:08.047495   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:26:08.069296   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1201 19:26:08.069377   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:26:08.088982   48804 provision.go:87] duration metric: took 291.155414ms to configureAuth
	I1201 19:26:08.089064   48804 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:26:08.089321   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:08.089350   48804 machine.go:97] duration metric: took 822.24428ms to provisionDockerMachine
	I1201 19:26:08.089385   48804 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:26:08.089416   48804 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:26:08.089542   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:26:08.089633   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.112132   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.217325   48804 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:26:08.220778   48804 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1201 19:26:08.220802   48804 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1201 19:26:08.220808   48804 command_runner.go:130] > VERSION_ID="12"
	I1201 19:26:08.220813   48804 command_runner.go:130] > VERSION="12 (bookworm)"
	I1201 19:26:08.220817   48804 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1201 19:26:08.220820   48804 command_runner.go:130] > ID=debian
	I1201 19:26:08.220825   48804 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1201 19:26:08.220831   48804 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1201 19:26:08.220837   48804 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1201 19:26:08.220885   48804 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:26:08.220907   48804 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:26:08.220919   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:26:08.220978   48804 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:26:08.221055   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:26:08.221066   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /etc/ssl/certs/43052.pem
	I1201 19:26:08.221140   48804 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:26:08.221148   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> /etc/test/nested/copy/4305/hosts
	I1201 19:26:08.221198   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:26:08.229002   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:08.246695   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:26:08.263789   48804 start.go:296] duration metric: took 174.371826ms for postStartSetup
	I1201 19:26:08.263869   48804 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:26:08.263931   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.281235   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.382466   48804 command_runner.go:130] > 12%
	I1201 19:26:08.382557   48804 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:26:08.386763   48804 command_runner.go:130] > 172G
	I1201 19:26:08.387182   48804 fix.go:56] duration metric: took 1.141096136s for fixHost
	I1201 19:26:08.387210   48804 start.go:83] releasing machines lock for "functional-428744", held for 1.141138241s
	I1201 19:26:08.387280   48804 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:26:08.405649   48804 ssh_runner.go:195] Run: cat /version.json
	I1201 19:26:08.405673   48804 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:26:08.405720   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.405736   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:08.424898   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.435929   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:08.615638   48804 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1201 19:26:08.615700   48804 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1201 19:26:08.615817   48804 ssh_runner.go:195] Run: systemctl --version
	I1201 19:26:08.621830   48804 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1201 19:26:08.621881   48804 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1201 19:26:08.622279   48804 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1201 19:26:08.626405   48804 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1201 19:26:08.626689   48804 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:26:08.626779   48804 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:26:08.634801   48804 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:26:08.634864   48804 start.go:496] detecting cgroup driver to use...
	I1201 19:26:08.634909   48804 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:26:08.634995   48804 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:26:08.650643   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:26:08.663900   48804 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:26:08.663962   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:26:08.680016   48804 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:26:08.693295   48804 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:26:08.807192   48804 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:26:08.949829   48804 docker.go:234] disabling docker service ...
	I1201 19:26:08.949910   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:26:08.965005   48804 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:26:08.978389   48804 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:26:09.113220   48804 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:26:09.265765   48804 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:26:09.280775   48804 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:26:09.295503   48804 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1201 19:26:09.296833   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:26:09.307263   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:26:09.316009   48804 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:26:09.316129   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:26:09.324849   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.333586   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:26:09.341989   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:26:09.350174   48804 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:26:09.358089   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:26:09.366694   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:26:09.375459   48804 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:26:09.384162   48804 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:26:09.390646   48804 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1201 19:26:09.391441   48804 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:26:09.398673   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:09.519779   48804 ssh_runner.go:195] Run: sudo systemctl restart containerd
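The run of `sed -i -r` edits above rewrites `/etc/containerd/config.toml` in place before restarting containerd. A minimal Python sketch of two of those substitutions (this mirrors the logged `sed` expressions; it is not minikube's actual implementation, and the sample TOML fragment is illustrative):

```python
import re

def apply_cgroupfs_edits(config_toml: str) -> str:
    """Mimic the logged sed edits: force SystemdCgroup = false (cgroupfs
    driver) and pin the sandbox image, preserving leading indentation."""
    # sed -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g'
    config_toml = re.sub(
        r"^( *)SystemdCgroup = .*$",
        r"\1SystemdCgroup = false",
        config_toml,
        flags=re.MULTILINE,
    )
    # sed -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|'
    config_toml = re.sub(
        r"^( *)sandbox_image = .*$",
        r'\1sandbox_image = "registry.k8s.io/pause:3.10.1"',
        config_toml,
        flags=re.MULTILINE,
    )
    return config_toml

sample = '    SystemdCgroup = true\n    sandbox_image = "registry.k8s.io/pause:3.9"\n'
print(apply_cgroupfs_edits(sample))
```

Because the backreference `\1` keeps the captured indentation, the edits are safe regardless of how deeply the keys are nested in the TOML tree — the same reason the original `sed` expressions capture `^( *)`.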
	I1201 19:26:09.650665   48804 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:26:09.650790   48804 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:26:09.655039   48804 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1201 19:26:09.655139   48804 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1201 19:26:09.655166   48804 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1201 19:26:09.655199   48804 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:09.655222   48804 command_runner.go:130] > Access: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655243   48804 command_runner.go:130] > Modify: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655266   48804 command_runner.go:130] > Change: 2025-12-01 19:26:09.616392816 +0000
	I1201 19:26:09.655294   48804 command_runner.go:130] >  Birth: -
	I1201 19:26:09.655330   48804 start.go:564] Will wait 60s for crictl version
	I1201 19:26:09.655409   48804 ssh_runner.go:195] Run: which crictl
	I1201 19:26:09.659043   48804 command_runner.go:130] > /usr/local/bin/crictl
	I1201 19:26:09.659221   48804 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:26:09.684907   48804 command_runner.go:130] > Version:  0.1.0
	I1201 19:26:09.684979   48804 command_runner.go:130] > RuntimeName:  containerd
	I1201 19:26:09.684999   48804 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1201 19:26:09.685021   48804 command_runner.go:130] > RuntimeApiVersion:  v1
	I1201 19:26:09.687516   48804 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:26:09.687623   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.708580   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1201 19:26:09.710309   48804 ssh_runner.go:195] Run: containerd --version
	I1201 19:26:09.728879   48804 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
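The bare version number in the "Preparing Kubernetes ... on containerd 2.1.5" status line is extracted from the `containerd --version` output captured above. A small sketch of that extraction (assumed logic, shown for illustration only):

```python
import re

def containerd_version(output: str) -> str:
    """Pull the semantic version out of `containerd --version` output,
    e.g. 'containerd containerd.io v2.1.5 fcd43222...' -> '2.1.5'."""
    match = re.search(r"\bv?(\d+\.\d+\.\d+)\b", output)
    if match is None:
        raise ValueError(f"unrecognized version output: {output!r}")
    return match.group(1)

print(containerd_version(
    "containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1"
))  # → 2.1.5
```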
	I1201 19:26:09.737012   48804 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:26:09.739912   48804 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:26:09.756533   48804 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:26:09.760816   48804 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1201 19:26:09.760978   48804 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:26:09.761088   48804 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:26:09.761147   48804 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:26:09.788491   48804 command_runner.go:130] > {
	I1201 19:26:09.788509   48804 command_runner.go:130] >   "images":  [
	I1201 19:26:09.788514   48804 command_runner.go:130] >     {
	I1201 19:26:09.788524   48804 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1201 19:26:09.788529   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788534   48804 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1201 19:26:09.788538   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788542   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788546   48804 command_runner.go:130] >       "size":  "8032639",
	I1201 19:26:09.788552   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788556   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788560   48804 command_runner.go:130] >     },
	I1201 19:26:09.788563   48804 command_runner.go:130] >     {
	I1201 19:26:09.788570   48804 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1201 19:26:09.788573   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788578   48804 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1201 19:26:09.788582   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788586   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788598   48804 command_runner.go:130] >       "size":  "21166088",
	I1201 19:26:09.788603   48804 command_runner.go:130] >       "username":  "nonroot",
	I1201 19:26:09.788611   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788615   48804 command_runner.go:130] >     },
	I1201 19:26:09.788617   48804 command_runner.go:130] >     {
	I1201 19:26:09.788624   48804 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1201 19:26:09.788628   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788633   48804 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1201 19:26:09.788636   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788639   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788643   48804 command_runner.go:130] >       "size":  "21134420",
	I1201 19:26:09.788647   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788651   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788654   48804 command_runner.go:130] >       },
	I1201 19:26:09.788658   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788662   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788665   48804 command_runner.go:130] >     },
	I1201 19:26:09.788668   48804 command_runner.go:130] >     {
	I1201 19:26:09.788675   48804 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1201 19:26:09.788678   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788685   48804 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1201 19:26:09.788689   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788692   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788697   48804 command_runner.go:130] >       "size":  "24676285",
	I1201 19:26:09.788700   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788704   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788707   48804 command_runner.go:130] >       },
	I1201 19:26:09.788711   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788715   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788718   48804 command_runner.go:130] >     },
	I1201 19:26:09.788721   48804 command_runner.go:130] >     {
	I1201 19:26:09.788728   48804 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1201 19:26:09.788732   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788739   48804 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1201 19:26:09.788743   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788750   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788755   48804 command_runner.go:130] >       "size":  "20658969",
	I1201 19:26:09.788759   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788762   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788765   48804 command_runner.go:130] >       },
	I1201 19:26:09.788769   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788773   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788776   48804 command_runner.go:130] >     },
	I1201 19:26:09.788779   48804 command_runner.go:130] >     {
	I1201 19:26:09.788786   48804 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1201 19:26:09.788790   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788795   48804 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1201 19:26:09.788799   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788803   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788807   48804 command_runner.go:130] >       "size":  "22428165",
	I1201 19:26:09.788814   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788818   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788822   48804 command_runner.go:130] >     },
	I1201 19:26:09.788825   48804 command_runner.go:130] >     {
	I1201 19:26:09.788832   48804 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1201 19:26:09.788835   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788841   48804 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1201 19:26:09.788844   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788855   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788860   48804 command_runner.go:130] >       "size":  "15389290",
	I1201 19:26:09.788863   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788867   48804 command_runner.go:130] >         "value":  "0"
	I1201 19:26:09.788870   48804 command_runner.go:130] >       },
	I1201 19:26:09.788874   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788878   48804 command_runner.go:130] >       "pinned":  false
	I1201 19:26:09.788881   48804 command_runner.go:130] >     },
	I1201 19:26:09.788883   48804 command_runner.go:130] >     {
	I1201 19:26:09.788890   48804 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1201 19:26:09.788897   48804 command_runner.go:130] >       "repoTags":  [
	I1201 19:26:09.788902   48804 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1201 19:26:09.788905   48804 command_runner.go:130] >       ],
	I1201 19:26:09.788908   48804 command_runner.go:130] >       "repoDigests":  [],
	I1201 19:26:09.788912   48804 command_runner.go:130] >       "size":  "265458",
	I1201 19:26:09.788920   48804 command_runner.go:130] >       "uid":  {
	I1201 19:26:09.788924   48804 command_runner.go:130] >         "value":  "65535"
	I1201 19:26:09.788927   48804 command_runner.go:130] >       },
	I1201 19:26:09.788931   48804 command_runner.go:130] >       "username":  "",
	I1201 19:26:09.788934   48804 command_runner.go:130] >       "pinned":  true
	I1201 19:26:09.788937   48804 command_runner.go:130] >     }
	I1201 19:26:09.788940   48804 command_runner.go:130] >   ]
	I1201 19:26:09.788943   48804 command_runner.go:130] > }
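The conclusion "all images are preloaded" on the next line follows from comparing the JSON above against the image set required for v1.35.0-beta.0. A Python sketch of that check (not minikube's Go code; the sample payload and `required` list are trimmed from the `sudo crictl images --output json` output just logged):

```python
import json

# Trimmed copy of the crictl JSON from the log above.
CRICTL_IMAGES = """
{
  "images": [
    {"repoTags": ["registry.k8s.io/kube-apiserver:v1.35.0-beta.0"]},
    {"repoTags": ["registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"]},
    {"repoTags": ["registry.k8s.io/kube-scheduler:v1.35.0-beta.0"]},
    {"repoTags": ["registry.k8s.io/kube-proxy:v1.35.0-beta.0"]},
    {"repoTags": ["registry.k8s.io/etcd:3.6.5-0"]},
    {"repoTags": ["registry.k8s.io/coredns/coredns:v1.13.1"]},
    {"repoTags": ["registry.k8s.io/pause:3.10.1"]},
    {"repoTags": ["gcr.io/k8s-minikube/storage-provisioner:v5"]}
  ]
}
"""

def missing_images(crictl_json: str, required: list[str]) -> list[str]:
    """Return every required tag absent from the crictl image listing."""
    present = {tag
               for image in json.loads(crictl_json)["images"]
               for tag in image.get("repoTags", [])}
    return [tag for tag in required if tag not in present]

required = [
    "registry.k8s.io/kube-apiserver:v1.35.0-beta.0",
    "registry.k8s.io/etcd:3.6.5-0",
    "registry.k8s.io/pause:3.10.1",
]
print(missing_images(CRICTL_IMAGES, required))  # → []
```

An empty result is what lets the run skip the tarball preload and log "Images are preloaded, skipping loading".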
	I1201 19:26:09.791239   48804 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:26:09.791264   48804 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:26:09.791273   48804 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:26:09.791374   48804 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:26:09.791446   48804 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:26:09.822661   48804 command_runner.go:130] > {
	I1201 19:26:09.822679   48804 command_runner.go:130] >   "cniconfig": {
	I1201 19:26:09.822684   48804 command_runner.go:130] >     "Networks": [
	I1201 19:26:09.822688   48804 command_runner.go:130] >       {
	I1201 19:26:09.822694   48804 command_runner.go:130] >         "Config": {
	I1201 19:26:09.822699   48804 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1201 19:26:09.822704   48804 command_runner.go:130] >           "Name": "cni-loopback",
	I1201 19:26:09.822709   48804 command_runner.go:130] >           "Plugins": [
	I1201 19:26:09.822712   48804 command_runner.go:130] >             {
	I1201 19:26:09.822717   48804 command_runner.go:130] >               "Network": {
	I1201 19:26:09.822721   48804 command_runner.go:130] >                 "ipam": {},
	I1201 19:26:09.822726   48804 command_runner.go:130] >                 "type": "loopback"
	I1201 19:26:09.822730   48804 command_runner.go:130] >               },
	I1201 19:26:09.822735   48804 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1201 19:26:09.822738   48804 command_runner.go:130] >             }
	I1201 19:26:09.822741   48804 command_runner.go:130] >           ],
	I1201 19:26:09.822751   48804 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1201 19:26:09.822755   48804 command_runner.go:130] >         },
	I1201 19:26:09.822760   48804 command_runner.go:130] >         "IFName": "lo"
	I1201 19:26:09.822764   48804 command_runner.go:130] >       }
	I1201 19:26:09.822771   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822776   48804 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1201 19:26:09.822780   48804 command_runner.go:130] >     "PluginDirs": [
	I1201 19:26:09.822784   48804 command_runner.go:130] >       "/opt/cni/bin"
	I1201 19:26:09.822787   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822792   48804 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1201 19:26:09.822795   48804 command_runner.go:130] >     "Prefix": "eth"
	I1201 19:26:09.822798   48804 command_runner.go:130] >   },
	I1201 19:26:09.822801   48804 command_runner.go:130] >   "config": {
	I1201 19:26:09.822805   48804 command_runner.go:130] >     "cdiSpecDirs": [
	I1201 19:26:09.822809   48804 command_runner.go:130] >       "/etc/cdi",
	I1201 19:26:09.822813   48804 command_runner.go:130] >       "/var/run/cdi"
	I1201 19:26:09.822816   48804 command_runner.go:130] >     ],
	I1201 19:26:09.822823   48804 command_runner.go:130] >     "cni": {
	I1201 19:26:09.822827   48804 command_runner.go:130] >       "binDir": "",
	I1201 19:26:09.822831   48804 command_runner.go:130] >       "binDirs": [
	I1201 19:26:09.822834   48804 command_runner.go:130] >         "/opt/cni/bin"
	I1201 19:26:09.822837   48804 command_runner.go:130] >       ],
	I1201 19:26:09.822842   48804 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1201 19:26:09.822846   48804 command_runner.go:130] >       "confTemplate": "",
	I1201 19:26:09.822849   48804 command_runner.go:130] >       "ipPref": "",
	I1201 19:26:09.822853   48804 command_runner.go:130] >       "maxConfNum": 1,
	I1201 19:26:09.822857   48804 command_runner.go:130] >       "setupSerially": false,
	I1201 19:26:09.822862   48804 command_runner.go:130] >       "useInternalLoopback": false
	I1201 19:26:09.822865   48804 command_runner.go:130] >     },
	I1201 19:26:09.822872   48804 command_runner.go:130] >     "containerd": {
	I1201 19:26:09.822876   48804 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1201 19:26:09.822881   48804 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1201 19:26:09.822886   48804 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1201 19:26:09.822892   48804 command_runner.go:130] >       "runtimes": {
	I1201 19:26:09.822896   48804 command_runner.go:130] >         "runc": {
	I1201 19:26:09.822901   48804 command_runner.go:130] >           "ContainerAnnotations": null,
	I1201 19:26:09.822905   48804 command_runner.go:130] >           "PodAnnotations": null,
	I1201 19:26:09.822914   48804 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1201 19:26:09.822919   48804 command_runner.go:130] >           "cgroupWritable": false,
	I1201 19:26:09.822923   48804 command_runner.go:130] >           "cniConfDir": "",
	I1201 19:26:09.822927   48804 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1201 19:26:09.822931   48804 command_runner.go:130] >           "io_type": "",
	I1201 19:26:09.822934   48804 command_runner.go:130] >           "options": {
	I1201 19:26:09.822939   48804 command_runner.go:130] >             "BinaryName": "",
	I1201 19:26:09.822943   48804 command_runner.go:130] >             "CriuImagePath": "",
	I1201 19:26:09.822947   48804 command_runner.go:130] >             "CriuWorkPath": "",
	I1201 19:26:09.822951   48804 command_runner.go:130] >             "IoGid": 0,
	I1201 19:26:09.822955   48804 command_runner.go:130] >             "IoUid": 0,
	I1201 19:26:09.822959   48804 command_runner.go:130] >             "NoNewKeyring": false,
	I1201 19:26:09.822963   48804 command_runner.go:130] >             "Root": "",
	I1201 19:26:09.822968   48804 command_runner.go:130] >             "ShimCgroup": "",
	I1201 19:26:09.822972   48804 command_runner.go:130] >             "SystemdCgroup": false
	I1201 19:26:09.822975   48804 command_runner.go:130] >           },
	I1201 19:26:09.822980   48804 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1201 19:26:09.822987   48804 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1201 19:26:09.822991   48804 command_runner.go:130] >           "runtimePath": "",
	I1201 19:26:09.822996   48804 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1201 19:26:09.823001   48804 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1201 19:26:09.823005   48804 command_runner.go:130] >           "snapshotter": ""
	I1201 19:26:09.823008   48804 command_runner.go:130] >         }
	I1201 19:26:09.823011   48804 command_runner.go:130] >       }
	I1201 19:26:09.823014   48804 command_runner.go:130] >     },
	I1201 19:26:09.823026   48804 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1201 19:26:09.823032   48804 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1201 19:26:09.823037   48804 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1201 19:26:09.823041   48804 command_runner.go:130] >     "disableApparmor": false,
	I1201 19:26:09.823045   48804 command_runner.go:130] >     "disableHugetlbController": true,
	I1201 19:26:09.823049   48804 command_runner.go:130] >     "disableProcMount": false,
	I1201 19:26:09.823054   48804 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1201 19:26:09.823058   48804 command_runner.go:130] >     "enableCDI": true,
	I1201 19:26:09.823068   48804 command_runner.go:130] >     "enableSelinux": false,
	I1201 19:26:09.823073   48804 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1201 19:26:09.823078   48804 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1201 19:26:09.823091   48804 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1201 19:26:09.823096   48804 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1201 19:26:09.823100   48804 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1201 19:26:09.823105   48804 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1201 19:26:09.823109   48804 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1201 19:26:09.823115   48804 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823119   48804 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1201 19:26:09.823125   48804 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1201 19:26:09.823129   48804 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1201 19:26:09.823135   48804 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1201 19:26:09.823138   48804 command_runner.go:130] >   },
	I1201 19:26:09.823141   48804 command_runner.go:130] >   "features": {
	I1201 19:26:09.823145   48804 command_runner.go:130] >     "supplemental_groups_policy": true
	I1201 19:26:09.823148   48804 command_runner.go:130] >   },
	I1201 19:26:09.823152   48804 command_runner.go:130] >   "golang": "go1.24.9",
	I1201 19:26:09.823162   48804 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823173   48804 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1201 19:26:09.823176   48804 command_runner.go:130] >   "runtimeHandlers": [
	I1201 19:26:09.823179   48804 command_runner.go:130] >     {
	I1201 19:26:09.823183   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823188   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823194   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823197   48804 command_runner.go:130] >       }
	I1201 19:26:09.823199   48804 command_runner.go:130] >     },
	I1201 19:26:09.823202   48804 command_runner.go:130] >     {
	I1201 19:26:09.823206   48804 command_runner.go:130] >       "features": {
	I1201 19:26:09.823211   48804 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1201 19:26:09.823215   48804 command_runner.go:130] >         "user_namespaces": true
	I1201 19:26:09.823218   48804 command_runner.go:130] >       },
	I1201 19:26:09.823221   48804 command_runner.go:130] >       "name": "runc"
	I1201 19:26:09.823228   48804 command_runner.go:130] >     }
	I1201 19:26:09.823231   48804 command_runner.go:130] >   ],
	I1201 19:26:09.823235   48804 command_runner.go:130] >   "status": {
	I1201 19:26:09.823239   48804 command_runner.go:130] >     "conditions": [
	I1201 19:26:09.823242   48804 command_runner.go:130] >       {
	I1201 19:26:09.823245   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823249   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823252   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823257   48804 command_runner.go:130] >         "type": "RuntimeReady"
	I1201 19:26:09.823260   48804 command_runner.go:130] >       },
	I1201 19:26:09.823263   48804 command_runner.go:130] >       {
	I1201 19:26:09.823269   48804 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1201 19:26:09.823274   48804 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1201 19:26:09.823277   48804 command_runner.go:130] >         "status": false,
	I1201 19:26:09.823282   48804 command_runner.go:130] >         "type": "NetworkReady"
	I1201 19:26:09.823285   48804 command_runner.go:130] >       },
	I1201 19:26:09.823288   48804 command_runner.go:130] >       {
	I1201 19:26:09.823292   48804 command_runner.go:130] >         "message": "",
	I1201 19:26:09.823295   48804 command_runner.go:130] >         "reason": "",
	I1201 19:26:09.823299   48804 command_runner.go:130] >         "status": true,
	I1201 19:26:09.823305   48804 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1201 19:26:09.823308   48804 command_runner.go:130] >       }
	I1201 19:26:09.823310   48804 command_runner.go:130] >     ]
	I1201 19:26:09.823313   48804 command_runner.go:130] >   }
	I1201 19:26:09.823316   48804 command_runner.go:130] > }
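The `status.conditions` array at the end of the `sudo crictl info` dump is what reveals the runtime state: `RuntimeReady` is true, but `NetworkReady` is false because no CNI config exists yet in `/etc/cni/net.d` — which is exactly why the next lines recommend kindnet for the docker driver + containerd combination. A sketch of reading those conditions (illustrative helper, not minikube code; the sample is condensed from the JSON above):

```python
import json

def not_ready_conditions(info_json: str) -> dict[str, str]:
    """Map each failing crictl status condition type to its message."""
    conditions = json.loads(info_json)["status"]["conditions"]
    return {c["type"]: c["message"] for c in conditions if not c["status"]}

sample = json.dumps({"status": {"conditions": [
    {"type": "RuntimeReady", "status": True, "message": ""},
    {"type": "NetworkReady", "status": False,
     "message": "Network plugin returns error: cni plugin not initialized"},
]}})
print(not_ready_conditions(sample))
# → {'NetworkReady': 'Network plugin returns error: cni plugin not initialized'}
```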
	I1201 19:26:09.824829   48804 cni.go:84] Creating CNI manager for ""
	I1201 19:26:09.824854   48804 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:26:09.824874   48804 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:26:09.824897   48804 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt Sta
ticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:26:09.825029   48804 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
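	The kubeadm config dumped above is one file holding four YAML documents separated by `---` (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration), which minikube then ships to the node as `kubeadm.yaml.new`. A minimal sketch of that multi-document layout, with field values trimmed and an illustrative `/tmp` path standing in for the real target:

```shell
# Skeleton of the four-document kubeadm config stream minikube generates
# (values trimmed; the real file carries the full options logged above).
cat > /tmp/kubeadm.demo.yaml <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
# Each `kind:` header marks one document; expect four.
grep -c '^kind:' /tmp/kubeadm.demo.yaml
```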
	I1201 19:26:09.825110   48804 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:26:09.833035   48804 command_runner.go:130] > kubeadm
	I1201 19:26:09.833056   48804 command_runner.go:130] > kubectl
	I1201 19:26:09.833061   48804 command_runner.go:130] > kubelet
	I1201 19:26:09.833076   48804 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:26:09.833134   48804 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:26:09.840788   48804 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:26:09.853581   48804 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:26:09.866488   48804 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1201 19:26:09.879364   48804 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:26:09.883102   48804 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1201 19:26:09.883255   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:10.007542   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:10.337813   48804 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:26:10.337836   48804 certs.go:195] generating shared ca certs ...
	I1201 19:26:10.337853   48804 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:10.338014   48804 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:26:10.338073   48804 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:26:10.338085   48804 certs.go:257] generating profile certs ...
	I1201 19:26:10.338185   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:26:10.338247   48804 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:26:10.338297   48804 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:26:10.338309   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1201 19:26:10.338322   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1201 19:26:10.338339   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1201 19:26:10.338351   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1201 19:26:10.338365   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1201 19:26:10.338377   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1201 19:26:10.338392   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1201 19:26:10.338406   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1201 19:26:10.338461   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:26:10.338495   48804 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:26:10.338507   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:26:10.338544   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:26:10.338574   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:26:10.338602   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:26:10.338653   48804 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:26:10.338691   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.338709   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.338720   48804 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem -> /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.339292   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:26:10.367504   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:26:10.391051   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:26:10.410924   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:26:10.429158   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:26:10.447137   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:26:10.464077   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:26:10.481473   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:26:10.498763   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:26:10.516542   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:26:10.534712   48804 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:26:10.552802   48804 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:26:10.565633   48804 ssh_runner.go:195] Run: openssl version
	I1201 19:26:10.571657   48804 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1201 19:26:10.572092   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:26:10.580812   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584562   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584589   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.584650   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:26:10.625269   48804 command_runner.go:130] > 3ec20f2e
	I1201 19:26:10.625746   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:26:10.633767   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:26:10.642160   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.645995   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646248   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.646315   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:26:10.686937   48804 command_runner.go:130] > b5213941
	I1201 19:26:10.687439   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:26:10.695499   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:26:10.704517   48804 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708133   48804 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708431   48804 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.708519   48804 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:26:10.749422   48804 command_runner.go:130] > 51391683
	I1201 19:26:10.749951   48804 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
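	The three `openssl x509 -hash` / `ln -fs` pairs above implement OpenSSL's hashed-directory CA lookup: each certificate in `/etc/ssl/certs` gets a `<subject-hash>.0` symlink so TLS clients can locate it by hash. A self-contained sketch of the same mechanism with a throwaway certificate (the `/tmp` paths and `demoCA` CN are illustrative, not minikube's):

```shell
# Generate a throwaway self-signed CA, compute its subject hash, and create
# the <hash>.0 symlink that OpenSSL's c_rehash-style lookup expects.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo-ca.key \
  -out /tmp/demo-ca.pem -days 1 -subj "/CN=demoCA" 2>/dev/null
hash=$(openssl x509 -hash -noout -in /tmp/demo-ca.pem)
ln -fs /tmp/demo-ca.pem "/tmp/${hash}.0"
echo "hash=${hash}"
```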
	I1201 19:26:10.758524   48804 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762526   48804 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:26:10.762565   48804 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1201 19:26:10.762572   48804 command_runner.go:130] > Device: 259,1	Inode: 1053621     Links: 1
	I1201 19:26:10.762579   48804 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1201 19:26:10.762585   48804 command_runner.go:130] > Access: 2025-12-01 19:22:03.818228473 +0000
	I1201 19:26:10.762590   48804 command_runner.go:130] > Modify: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762599   48804 command_runner.go:130] > Change: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762604   48804 command_runner.go:130] >  Birth: 2025-12-01 19:17:59.714758067 +0000
	I1201 19:26:10.762682   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:26:10.803623   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.804107   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:26:10.845983   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.846486   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:26:10.887221   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.887637   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:26:10.928253   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.928695   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:26:10.970677   48804 command_runner.go:130] > Certificate will not expire
	I1201 19:26:10.971198   48804 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:26:11.012420   48804 command_runner.go:130] > Certificate will not expire
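	Each `-checkend 86400` run above asks whether the certificate expires within the next 86400 seconds (24 hours); exit status 0 plus the "Certificate will not expire" message means it stays valid for that window. A sketch against a throwaway two-day certificate (path and CN are illustrative):

```shell
# A cert valid for 2 days does not expire within the next 24h, so
# -checkend 86400 exits 0 and prints the same message seen in the log.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/ce.key \
  -out /tmp/ce.pem -days 2 -subj "/CN=checkend-demo" 2>/dev/null
openssl x509 -noout -in /tmp/ce.pem -checkend 86400
```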
	I1201 19:26:11.012544   48804 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:26:11.012658   48804 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:26:11.012733   48804 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:26:11.044110   48804 cri.go:89] found id: ""
	I1201 19:26:11.044177   48804 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:26:11.054430   48804 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1201 19:26:11.054508   48804 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1201 19:26:11.054530   48804 command_runner.go:130] > /var/lib/minikube/etcd:
	I1201 19:26:11.054631   48804 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:26:11.054642   48804 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:26:11.054719   48804 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:26:11.063470   48804 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:26:11.063923   48804 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-428744" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.064051   48804 kubeconfig.go:62] /home/jenkins/minikube-integration/21997-2497/kubeconfig needs updating (will repair): [kubeconfig missing "functional-428744" cluster setting kubeconfig missing "functional-428744" context setting]
	I1201 19:26:11.064410   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.064918   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.065081   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.065855   48804 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1201 19:26:11.065877   48804 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1201 19:26:11.065883   48804 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1201 19:26:11.065889   48804 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1201 19:26:11.065893   48804 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1201 19:26:11.065945   48804 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1201 19:26:11.066161   48804 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:26:11.074525   48804 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1201 19:26:11.074603   48804 kubeadm.go:602] duration metric: took 19.955614ms to restartPrimaryControlPlane
	I1201 19:26:11.074623   48804 kubeadm.go:403] duration metric: took 62.08191ms to StartCluster
	I1201 19:26:11.074644   48804 settings.go:142] acquiring lock: {Name:mk0c68be267fd1e06eeb79721201896d000b433c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.074712   48804 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.075396   48804 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:26:11.075623   48804 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 19:26:11.076036   48804 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:26:11.076070   48804 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1201 19:26:11.076207   48804 addons.go:70] Setting storage-provisioner=true in profile "functional-428744"
	I1201 19:26:11.076225   48804 addons.go:239] Setting addon storage-provisioner=true in "functional-428744"
	I1201 19:26:11.076239   48804 addons.go:70] Setting default-storageclass=true in profile "functional-428744"
	I1201 19:26:11.076254   48804 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-428744"
	I1201 19:26:11.076255   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.076600   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.076785   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.081245   48804 out.go:179] * Verifying Kubernetes components...
	I1201 19:26:11.087150   48804 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:26:11.117851   48804 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 19:26:11.119516   48804 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:26:11.119671   48804 kapi.go:59] client config for functional-428744: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 19:26:11.120991   48804 addons.go:239] Setting addon default-storageclass=true in "functional-428744"
	I1201 19:26:11.121044   48804 host.go:66] Checking if "functional-428744" exists ...
	I1201 19:26:11.121546   48804 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:26:11.121741   48804 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.121759   48804 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1201 19:26:11.121797   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.157953   48804 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:11.157978   48804 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1201 19:26:11.158049   48804 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:26:11.182138   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.197665   48804 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:26:11.313464   48804 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:26:11.333888   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:11.351804   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.088419   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088456   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088499   48804 retry.go:31] will retry after 370.622111ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088535   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.088549   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088556   48804 retry.go:31] will retry after 214.864091ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.088649   48804 node_ready.go:35] waiting up to 6m0s for node "functional-428744" to be "Ready" ...
	I1201 19:26:12.088787   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.088873   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.089197   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.304654   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.362814   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.366340   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.366413   48804 retry.go:31] will retry after 398.503688ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.459632   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:12.519830   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.523259   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.523294   48804 retry.go:31] will retry after 535.054731ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.589478   48804 type.go:168] "Request Body" body=""
	I1201 19:26:12.589570   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:12.589862   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:12.765159   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:12.827324   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:12.827370   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:12.827390   48804 retry.go:31] will retry after 739.755241ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.058728   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.089511   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.089585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.089856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.118077   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.118134   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.118154   48804 retry.go:31] will retry after 391.789828ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.510836   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:13.567332   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:13.570397   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.574026   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.574060   48804 retry.go:31] will retry after 1.18201014s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.589346   48804 type.go:168] "Request Body" body=""
	I1201 19:26:13.589417   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:13.589845   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:13.644640   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:13.644678   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:13.644695   48804 retry.go:31] will retry after 732.335964ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.089422   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.089515   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:14.089961   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:14.377221   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:14.438375   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.438421   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.438440   48804 retry.go:31] will retry after 1.236140087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.589732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:14.589826   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:14.590183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:14.756655   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:14.814049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:14.817149   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:14.817181   48804 retry.go:31] will retry after 1.12716485s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.089765   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.089856   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.090157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:15.588981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:15.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:15.675732   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:15.741410   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:15.741450   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.741469   48804 retry.go:31] will retry after 1.409201229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:15.944883   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:16.007405   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:16.007500   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.007543   48804 retry.go:31] will retry after 1.898784229s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:16.089691   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.089768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.090129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:16.090198   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:16.589482   48804 type.go:168] "Request Body" body=""
	I1201 19:26:16.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:16.589810   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.089728   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.151412   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:17.212400   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.212446   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.212468   48804 retry.go:31] will retry after 4.05952317s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.588902   48804 type.go:168] "Request Body" body=""
	I1201 19:26:17.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:17.589279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:17.906643   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:17.968049   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:17.968156   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:17.968182   48804 retry.go:31] will retry after 2.840296794s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:18.089284   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.089352   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.089631   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:18.588972   48804 type.go:168] "Request Body" body=""
	I1201 19:26:18.589046   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:18.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:18.589394   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:19.089061   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.089132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.089421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:19.588859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:19.588929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:19.589194   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.088895   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.089306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:26:20.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:20.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:20.808702   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:20.866089   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:20.869253   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:20.869291   48804 retry.go:31] will retry after 4.860979312s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.089785   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.089854   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.090172   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:21.090222   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:21.272551   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:21.327980   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:21.331648   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.331684   48804 retry.go:31] will retry after 4.891109087s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:21.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:26:21.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:21.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.089331   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.089409   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.089753   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:22.589555   48804 type.go:168] "Request Body" body=""
	I1201 19:26:22.589684   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:22.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.089701   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.089772   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.090125   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:23.589808   48804 type.go:168] "Request Body" body=""
	I1201 19:26:23.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:23.590266   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:23.590323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:24.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.089005   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.089273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:24.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:26:24.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:24.589377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.089325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:26:25.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:25.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:25.730733   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:25.787142   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:25.790610   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:25.790640   48804 retry.go:31] will retry after 7.92097549s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:26.089409   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:26.223678   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:26.278607   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:26.281989   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.282022   48804 retry.go:31] will retry after 7.531816175s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:26.589432   48804 type.go:168] "Request Body" body=""
	I1201 19:26:26.589521   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:26.589840   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.089669   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.089751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.090069   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:27.589693   48804 type.go:168] "Request Body" body=""
	I1201 19:26:27.589764   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:27.590089   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:28.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.089997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.090335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:28.090387   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:28.589510   48804 type.go:168] "Request Body" body=""
	I1201 19:26:28.589583   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:28.589844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.089683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.090056   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:29.589880   48804 type.go:168] "Request Body" body=""
	I1201 19:26:29.589968   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:29.590369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:30.109683   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.109762   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.110054   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:30.110098   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:30.589806   48804 type.go:168] "Request Body" body=""
	I1201 19:26:30.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:30.590200   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.089177   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.089252   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.089645   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:31.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:26:31.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:31.589198   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:32.589085   48804 type.go:168] "Request Body" body=""
	I1201 19:26:32.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:32.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:32.589565   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:33.089830   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.089902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.090208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:26:33.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:33.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:33.712788   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:33.771136   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.774250   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.774284   48804 retry.go:31] will retry after 5.105632097s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.814618   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:33.891338   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:33.891375   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:33.891394   48804 retry.go:31] will retry after 5.576720242s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:34.089900   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.089994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.090334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:34.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:26:34.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:34.589260   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:35.088913   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.088982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:35.089359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:35.589057   48804 type.go:168] "Request Body" body=""
	I1201 19:26:35.589129   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:35.589530   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.089310   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:36.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:26:36.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:36.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:37.089182   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.089255   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:37.089610   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:37.589091   48804 type.go:168] "Request Body" body=""
	I1201 19:26:37.589170   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:37.589433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.089395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:26:38.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:38.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:38.880960   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:38.943302   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:38.943343   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:38.943363   48804 retry.go:31] will retry after 13.228566353s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.089598   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.089672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.089960   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:39.090011   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:39.469200   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:39.525826   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:39.528963   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.528998   48804 retry.go:31] will retry after 17.183760318s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:39.589169   48804 type.go:168] "Request Body" body=""
	I1201 19:26:39.589241   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:39.589577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.089008   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.089433   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:40.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:26:40.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:40.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.089139   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.089214   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.089595   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:41.589301   48804 type.go:168] "Request Body" body=""
	I1201 19:26:41.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:41.589750   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:41.589806   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:42.089592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.089667   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.089940   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:42.589720   48804 type.go:168] "Request Body" body=""
	I1201 19:26:42.589791   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:42.590109   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.089757   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.089835   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.090111   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:43.589514   48804 type.go:168] "Request Body" body=""
	I1201 19:26:43.589585   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:43.589848   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:43.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:44.089653   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.089754   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:44.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:26:44.589895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:44.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:45.090381   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.090466   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.092630   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1201 19:26:45.589592   48804 type.go:168] "Request Body" body=""
	I1201 19:26:45.589673   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:45.590001   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:45.590051   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:46.089834   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.089916   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.090265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:46.588963   48804 type.go:168] "Request Body" body=""
	I1201 19:26:46.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:46.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.089324   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.089402   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.089734   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:47.589563   48804 type.go:168] "Request Body" body=""
	I1201 19:26:47.589642   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:47.590061   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:47.590178   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:48.089732   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.089808   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.090071   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:48.589851   48804 type.go:168] "Request Body" body=""
	I1201 19:26:48.589928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:48.590267   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.088857   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.088930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.089271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:49.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:49.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:49.590253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:49.590304   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:50.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:50.589030   48804 type.go:168] "Request Body" body=""
	I1201 19:26:50.589106   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:50.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.089232   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.089302   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.089614   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:51.589210   48804 type.go:168] "Request Body" body=""
	I1201 19:26:51.589283   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:51.589653   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:52.089565   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.089648   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.089984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:52.090044   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:52.172403   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:26:52.228163   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:52.231129   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.231163   48804 retry.go:31] will retry after 19.315790709s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:52.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:26:52.589726   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:52.589977   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.089744   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.089824   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:53.589859   48804 type.go:168] "Request Body" body=""
	I1201 19:26:53.589934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:53.590235   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:54.589137   48804 type.go:168] "Request Body" body=""
	I1201 19:26:54.589243   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:54.589618   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:54.589675   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:55.089338   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.089423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.089771   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:55.589523   48804 type.go:168] "Request Body" body=""
	I1201 19:26:55.589594   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:55.589856   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.089664   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.588804   48804 type.go:168] "Request Body" body=""
	I1201 19:26:56.588881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:56.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:56.713576   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:26:56.772710   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:26:56.775873   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:56.775910   48804 retry.go:31] will retry after 15.04087383s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:26:57.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.089334   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.089591   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:57.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:26:57.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:26:57.589000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:57.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:58.588867   48804 type.go:168] "Request Body" body=""
	I1201 19:26:58.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:58.589237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.088980   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.089051   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.089363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:26:59.589124   48804 type.go:168] "Request Body" body=""
	I1201 19:26:59.589220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:26:59.589536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:26:59.589590   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:00.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.089350   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.089679   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:00.589522   48804 type.go:168] "Request Body" body=""
	I1201 19:27:00.589597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:00.589979   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.088921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.089174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:01.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:01.588991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:01.589359   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:02.089003   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.089084   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.089441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:02.089520   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:02.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:27:02.588931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:02.589218   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:03.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:03.589160   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:03.589510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:04.089192   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.089265   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:04.089578   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:04.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:04.589355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.088902   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.088977   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:05.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:05.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:05.589296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.088932   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:06.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:27:06.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:06.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:06.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:07.088819   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.089191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:07.588888   48804 type.go:168] "Request Body" body=""
	I1201 19:27:07.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:07.589307   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:08.589807   48804 type.go:168] "Request Body" body=""
	I1201 19:27:08.589880   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:08.590129   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:08.590170   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:09.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.089269   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:09.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:27:09.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:09.589272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:27:10.589096   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:10.589428   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.089194   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.089643   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:11.089702   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:11.547197   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:11.589587   48804 type.go:168] "Request Body" body=""
	I1201 19:27:11.589653   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:11.589873   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:11.606598   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.609801   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.609839   48804 retry.go:31] will retry after 19.642669348s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.817534   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:11.881682   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:11.881743   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:11.881763   48804 retry.go:31] will retry after 44.665994167s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:12.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:12.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:27:12.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:12.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.088988   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:13.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:27:13.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:13.589344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:13.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:14.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:14.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:27:14.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:14.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.088989   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.089075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.089465   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:15.589182   48804 type.go:168] "Request Body" body=""
	I1201 19:27:15.589270   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:15.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:15.589609   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:16.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:16.588919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:16.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:16.589317   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.089377   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:17.588846   48804 type.go:168] "Request Body" body=""
	I1201 19:27:17.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:17.589232   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:18.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:18.089371   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:18.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:27:18.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:18.589350   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.088809   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.088891   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.089153   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:19.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:27:19.588895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:19.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:20.089911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.089989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.090331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:20.090392   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:20.589054   48804 type.go:168] "Request Body" body=""
	I1201 19:27:20.589132   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:20.589374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.089268   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.089343   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.089681   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:21.589436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:21.589535   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:21.589948   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.088847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.088935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.089210   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:22.588895   48804 type.go:168] "Request Body" body=""
	I1201 19:27:22.588975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:22.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:22.589363   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:23.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.089301   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:23.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:27:23.589747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:23.589992   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.089847   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.089932   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.090273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:24.588986   48804 type.go:168] "Request Body" body=""
	I1201 19:27:24.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:24.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:24.589445   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:25.089736   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.089809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.090059   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:25.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:27:25.588915   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:25.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.089346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:26.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:27:26.588967   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:26.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:27.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.089316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:27.089370   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:27.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:27:27.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:27.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.089038   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.089114   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:28.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:27:28.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:28.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:29.089044   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.089124   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.089459   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:29.089532   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:29.589183   48804 type.go:168] "Request Body" body=""
	I1201 19:27:29.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:29.589521   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.089020   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.089103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.089462   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:30.588937   48804 type.go:168] "Request Body" body=""
	I1201 19:27:30.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:30.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.088828   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.088907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.089239   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:31.252679   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:27:31.310178   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:31.313107   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.313144   48804 retry.go:31] will retry after 31.234541362s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1201 19:27:31.589652   48804 type.go:168] "Request Body" body=""
	I1201 19:27:31.589739   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:31.590099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:31.590157   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:32.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:32.589064   48804 type.go:168] "Request Body" body=""
	I1201 19:27:32.589140   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:32.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.089586   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:33.589302   48804 type.go:168] "Request Body" body=""
	I1201 19:27:33.589377   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:33.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:34.089480   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.089566   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.089825   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:34.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:34.589708   48804 type.go:168] "Request Body" body=""
	I1201 19:27:34.589788   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:34.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.088875   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.088959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:35.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:35.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:35.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.089209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:36.588881   48804 type.go:168] "Request Body" body=""
	I1201 19:27:36.588958   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:36.589291   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:36.589344   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:37.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.088942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.089244   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:37.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:37.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:37.589284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.088990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:38.589536   48804 type.go:168] "Request Body" body=""
	I1201 19:27:38.589614   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:38.589859   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:38.589897   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:39.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.089743   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.090090   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:39.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:27:39.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:39.590181   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.088979   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.089261   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:40.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:27:40.589033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:40.589335   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:41.089238   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.089312   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.089670   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:41.089726   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:41.589477   48804 type.go:168] "Request Body" body=""
	I1201 19:27:41.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:41.589816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.089787   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.089858   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.090183   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:42.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:27:42.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:42.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.088926   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.089322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:43.588918   48804 type.go:168] "Request Body" body=""
	I1201 19:27:43.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:43.589305   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:43.589360   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:44.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:44.589583   48804 type.go:168] "Request Body" body=""
	I1201 19:27:44.589664   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:44.589930   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.089936   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.090240   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:45.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:27:45.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:45.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:45.589423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:46.088911   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.089243   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:46.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:46.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:46.589328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.088993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.089287   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:47.588825   48804 type.go:168] "Request Body" body=""
	I1201 19:27:47.588900   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:47.589160   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:48.088919   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.089001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:48.089402   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:48.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:27:48.589148   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:48.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.089140   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.089204   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.089439   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:49.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:27:49.588992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:49.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:50.088977   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.089060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.089402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:50.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:50.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:27:50.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:50.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.089222   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.089296   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.089666   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:51.589233   48804 type.go:168] "Request Body" body=""
	I1201 19:27:51.589315   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:51.589663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:52.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.089519   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.089816   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:52.089874   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:52.589625   48804 type.go:168] "Request Body" body=""
	I1201 19:27:52.589697   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:52.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.089857   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.089935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.090294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:53.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:27:53.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:53.589254   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.089015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.089419   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:54.588992   48804 type.go:168] "Request Body" body=""
	I1201 19:27:54.589064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:54.589387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:54.589442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:55.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.088988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:55.589056   48804 type.go:168] "Request Body" body=""
	I1201 19:27:55.589135   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:55.589478   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.089109   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.548010   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1201 19:27:56.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:27:56.589039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:56.589293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:56.618422   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621596   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:27:56.621692   48804 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:27:57.089694   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.089774   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.090105   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:57.090156   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:27:57.588869   48804 type.go:168] "Request Body" body=""
	I1201 19:27:57.588942   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:57.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.089844   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.089911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.090167   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:58.588968   48804 type.go:168] "Request Body" body=""
	I1201 19:27:58.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:58.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.089080   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.089152   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.089448   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:27:59.589149   48804 type.go:168] "Request Body" body=""
	I1201 19:27:59.589228   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:27:59.589504   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:27:59.589556   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:00.089006   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:00.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:00.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:00.589383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.089210   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.089282   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:01.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:28:01.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:01.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.088960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.089036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:02.089423   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:02.547921   48804 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1201 19:28:02.588962   48804 type.go:168] "Request Body" body=""
	I1201 19:28:02.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:02.589300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:02.609226   48804 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612351   48804 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1201 19:28:02.612446   48804 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1201 19:28:02.615606   48804 out.go:179] * Enabled addons: 
	I1201 19:28:02.619164   48804 addons.go:530] duration metric: took 1m51.54309696s for enable addons: enabled=[]
	I1201 19:28:03.089670   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.090185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:03.588958   48804 type.go:168] "Request Body" body=""
	I1201 19:28:03.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:03.589361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.089034   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.089110   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:04.588949   48804 type.go:168] "Request Body" body=""
	I1201 19:28:04.589049   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:04.589402   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:04.589461   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:05.089449   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.089546   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.089857   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:05.589588   48804 type.go:168] "Request Body" body=""
	I1201 19:28:05.589671   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:05.589935   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.089746   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.089819   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.090155   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:06.588853   48804 type.go:168] "Request Body" body=""
	I1201 19:28:06.588925   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:06.589422   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:07.089306   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.089384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.089671   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:07.089725   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:07.589476   48804 type.go:168] "Request Body" body=""
	I1201 19:28:07.589563   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:07.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.089665   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.089738   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.090110   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:08.589762   48804 type.go:168] "Request Body" body=""
	I1201 19:28:08.589829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:08.590138   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:09.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:28:09.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:09.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:09.589404   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:10.089052   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.089126   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:10.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:28:10.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:10.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.089232   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.089589   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:11.589170   48804 type.go:168] "Request Body" body=""
	I1201 19:28:11.589250   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:11.589715   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:11.589763   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:12.089752   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.089829   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:12.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:28:12.588998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:12.589379   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.089832   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.089899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.090285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:13.588827   48804 type.go:168] "Request Body" body=""
	I1201 19:28:13.588899   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:13.589250   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:14.088849   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.089292   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:14.089362   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:14.589658   48804 type.go:168] "Request Body" body=""
	I1201 19:28:14.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:14.589982   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.089907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.089992   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.090441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:15.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:28:15.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:15.589364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:16.088968   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.089055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.089536   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:16.089598   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:16.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:16.589013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:16.589342   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.088906   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.088984   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.089298   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:17.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:28:17.589065   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:17.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.088959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:18.588915   48804 type.go:168] "Request Body" body=""
	I1201 19:28:18.588988   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:18.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:18.589385   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:19.089029   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:19.588898   48804 type.go:168] "Request Body" body=""
	I1201 19:28:19.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:19.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.089123   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.089516   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:20.588858   48804 type.go:168] "Request Body" body=""
	I1201 19:28:20.588926   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:20.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:21.089156   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.089230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:21.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:21.588946   48804 type.go:168] "Request Body" body=""
	I1201 19:28:21.589024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:21.589356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.089252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:22.588847   48804 type.go:168] "Request Body" body=""
	I1201 19:28:22.588920   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:22.589241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.089024   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:23.588809   48804 type.go:168] "Request Body" body=""
	I1201 19:28:23.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:23.589219   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:23.589269   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:24.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.089373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:24.588960   48804 type.go:168] "Request Body" body=""
	I1201 19:28:24.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:24.589427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.089763   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.089831   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.090097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:25.589881   48804 type.go:168] "Request Body" body=""
	I1201 19:28:25.589959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:25.590297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:25.590357   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:26.089013   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.089528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:26.589214   48804 type.go:168] "Request Body" body=""
	I1201 19:28:26.589286   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:26.589603   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.089467   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.089559   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.089881   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:27.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:27.589752   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:27.590104   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:28.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.089776   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.090051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:28.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:28.589863   48804 type.go:168] "Request Body" body=""
	I1201 19:28:28.589941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:28.590271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.089376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:29.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:28:29.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:29.589270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.088976   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.089064   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.089446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:30.589171   48804 type.go:168] "Request Body" body=""
	I1201 19:28:30.589249   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:30.589613   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:30.589671   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:31.089382   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.089449   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.089763   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:31.589556   48804 type.go:168] "Request Body" body=""
	I1201 19:28:31.589638   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:31.589939   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.088836   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.089242   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:32.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:28:32.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:32.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:33.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.089356   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:33.089416   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:33.588938   48804 type.go:168] "Request Body" body=""
	I1201 19:28:33.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.088874   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.089304   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:34.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:34.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:34.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.089364   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:35.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:28:35.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:35.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:35.589306   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:36.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.089000   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.089328   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:36.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:28:36.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:36.589327   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.088865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.089234   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:37.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:28:37.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:37.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:37.589407   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:38.089085   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.089167   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.089517   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:38.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:28:38.588949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:38.589220   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.089344   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:39.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:28:39.589011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:39.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:40.096455   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.096551   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.096874   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:40.097064   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:40.589786   48804 type.go:168] "Request Body" body=""
	I1201 19:28:40.589855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:40.590188   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.089116   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.089196   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.089535   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:41.589129   48804 type.go:168] "Request Body" body=""
	I1201 19:28:41.589203   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:41.589458   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.089448   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.089553   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.089900   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:42.589577   48804 type.go:168] "Request Body" body=""
	I1201 19:28:42.589661   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:42.590007   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:42.590065   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:43.089576   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.089651   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.089904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:43.589673   48804 type.go:168] "Request Body" body=""
	I1201 19:28:43.589746   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:43.590046   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.089837   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.089907   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.090256   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:44.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:28:44.588933   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:44.589199   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:45.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.089331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:45.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:45.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:28:45.589171   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:45.589562   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.089851   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.089921   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.090252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:46.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:28:46.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:46.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:47.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.089037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.089393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:47.089451   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:47.588861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:47.588928   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:47.589192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.088891   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.089303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:48.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:28:48.589063   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:48.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:49.089120   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.089200   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.089463   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:49.089529   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:49.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:49.588993   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:49.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.088816   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.088895   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.089241   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:50.588922   48804 type.go:168] "Request Body" body=""
	I1201 19:28:50.588987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:50.589245   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:51.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.089220   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:51.089605   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:51.589292   48804 type.go:168] "Request Body" body=""
	I1201 19:28:51.589374   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:51.589732   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.089536   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.089603   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.089870   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:52.589721   48804 type.go:168] "Request Body" body=""
	I1201 19:28:52.589798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:52.590135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.088861   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.089284   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:53.588978   48804 type.go:168] "Request Body" body=""
	I1201 19:28:53.589055   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:53.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:53.589377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:54.088970   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.089061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.089555   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:54.589299   48804 type.go:168] "Request Body" body=""
	I1201 19:28:54.589391   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:54.589805   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.089595   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.089665   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.089924   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:55.589675   48804 type.go:168] "Request Body" body=""
	I1201 19:28:55.589751   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:55.590051   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:55.590097   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:56.089729   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.089807   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.090169   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:56.589823   48804 type.go:168] "Request Body" body=""
	I1201 19:28:56.589890   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:56.590185   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.088918   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.089318   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:57.589032   48804 type.go:168] "Request Body" body=""
	I1201 19:28:57.589112   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:57.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:58.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.089269   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.089543   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:28:58.089583   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:28:58.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:28:58.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:58.589352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:28:59.589585   48804 type.go:168] "Request Body" body=""
	I1201 19:28:59.589652   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:28:59.589904   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:00.090091   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.090176   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.090503   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:00.090549   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:00.589349   48804 type.go:168] "Request Body" body=""
	I1201 19:29:00.589423   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:00.589759   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.089644   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.089715   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.089978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:01.589828   48804 type.go:168] "Request Body" body=""
	I1201 19:29:01.589917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:01.590306   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:02.588896   48804 type.go:168] "Request Body" body=""
	I1201 19:29:02.588963   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:02.589271   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:02.589323   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:03.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:03.589098   48804 type.go:168] "Request Body" body=""
	I1201 19:29:03.589185   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:03.589576   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.089251   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.089329   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.089606   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:04.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:29:04.588973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:04.589278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:05.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:05.089378   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:05.588996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:05.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:05.589369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:06.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:06.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:06.589275   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.088820   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.088892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.089135   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:07.589860   48804 type.go:168] "Request Body" body=""
	I1201 19:29:07.589935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:07.590230   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:07.590276   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:08.088928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.089375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:08.588887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:08.588960   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:08.589213   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.088905   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.088991   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.089309   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:09.589025   48804 type.go:168] "Request Body" body=""
	I1201 19:29:09.589102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:09.589421   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:10.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.089125   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:10.089434   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:10.589102   48804 type.go:168] "Request Body" body=""
	I1201 19:29:10.589179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:10.589460   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.089329   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.089406   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.089844   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:11.589591   48804 type.go:168] "Request Body" body=""
	I1201 19:29:11.589659   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:11.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.088917   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:12.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:29:12.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:12.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:12.589414   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:13.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:13.588959   48804 type.go:168] "Request Body" body=""
	I1201 19:29:13.589047   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:13.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.089033   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.089105   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.089449   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:14.588871   48804 type.go:168] "Request Body" body=""
	I1201 19:29:14.588938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:14.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:15.089001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.089392   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:15.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:15.589096   48804 type.go:168] "Request Body" body=""
	I1201 19:29:15.589199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:15.589514   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.089742   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.089812   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.090072   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:16.589844   48804 type.go:168] "Request Body" body=""
	I1201 19:29:16.589924   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:16.590265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:17.088934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:17.089471   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:17.589173   48804 type.go:168] "Request Body" body=""
	I1201 19:29:17.589246   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:17.589526   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.088963   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.089323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:18.589022   48804 type.go:168] "Request Body" body=""
	I1201 19:29:18.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:18.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.088922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.089208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:19.588956   48804 type.go:168] "Request Body" body=""
	I1201 19:29:19.589034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:19.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:19.589431   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:20.089101   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.089182   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.089476   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:20.588868   48804 type.go:168] "Request Body" body=""
	I1201 19:29:20.588935   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:20.589182   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.089165   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.089236   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.089546   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:21.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:21.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:21.589316   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:22.089229   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.089646   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:22.089715   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:22.589537   48804 type.go:168] "Request Body" body=""
	I1201 19:29:22.589607   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:22.589906   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.089700   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.089798   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.090113   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:23.589764   48804 type.go:168] "Request Body" body=""
	I1201 19:29:23.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:23.590144   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.088883   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.088952   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.089296   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:24.589001   48804 type.go:168] "Request Body" body=""
	I1201 19:29:24.589080   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:24.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:24.589410   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:25.088898   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.088976   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:25.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:29:25.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:25.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.089032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.089398   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:26.588893   48804 type.go:168] "Request Body" body=""
	I1201 19:29:26.588972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:26.589273   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:27.088957   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.089025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:27.089379   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:27.589046   48804 type.go:168] "Request Body" body=""
	I1201 19:29:27.589122   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:27.589420   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.088944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.089204   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:28.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:29:28.589028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:28.589360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:29.089056   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.089134   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.089452   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:29.089528   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:29.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:29:29.589233   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:29.589511   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.088929   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.089013   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.089391   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:30.588935   48804 type.go:168] "Request Body" body=""
	I1201 19:29:30.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:30.589289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:31.089148   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.089217   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.089510   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:31.089554   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:31.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:29:31.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:31.589395   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.089002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.089349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:32.589582   48804 type.go:168] "Request Body" body=""
	I1201 19:29:32.589657   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:32.589912   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:33.089719   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.089796   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.090165   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:33.090228   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:33.588933   48804 type.go:168] "Request Body" body=""
	I1201 19:29:33.589020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:33.589368   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.089059   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.089141   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.089472   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:34.588913   48804 type.go:168] "Request Body" body=""
	I1201 19:29:34.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:34.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.089077   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.089208   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.089761   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:35.589549   48804 type.go:168] "Request Body" body=""
	I1201 19:29:35.589624   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:35.589888   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:35.589927   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:36.089659   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.089734   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.090095   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:36.588832   48804 type.go:168] "Request Body" body=""
	I1201 19:29:36.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:36.589251   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.088965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.089289   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:37.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:29:37.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:37.589346   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:38.088942   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.089408   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:38.089459   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:38.588843   48804 type.go:168] "Request Body" body=""
	I1201 19:29:38.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:38.589178   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.088880   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.088961   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.089264   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:39.588969   48804 type.go:168] "Request Body" body=""
	I1201 19:29:39.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:39.589385   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.088901   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.089312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:40.588965   48804 type.go:168] "Request Body" body=""
	I1201 19:29:40.589041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:40.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:40.589403   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:41.089288   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.089366   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.089704   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:41.589423   48804 type.go:168] "Request Body" body=""
	I1201 19:29:41.589506   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:41.589815   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.089782   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.089864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.090168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:29:42.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:42.589534   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:42.589596   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:43.089242   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.089310   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.089663   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:43.589454   48804 type.go:168] "Request Body" body=""
	I1201 19:29:43.589549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:43.589901   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.089759   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.089838   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.090150   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:44.588838   48804 type.go:168] "Request Body" body=""
	I1201 19:29:44.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:44.589175   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:45.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.089091   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.089396   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:45.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:45.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:29:45.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:45.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.088887   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.089311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:46.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:29:46.589071   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:46.589393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:47.089406   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.089500   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.089826   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:47.089884   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:47.589600   48804 type.go:168] "Request Body" body=""
	I1201 19:29:47.589672   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:47.589966   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.089769   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.090162   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:48.588884   48804 type.go:168] "Request Body" body=""
	I1201 19:29:48.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:48.589326   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.089089   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.089367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:49.588995   48804 type.go:168] "Request Body" body=""
	I1201 19:29:49.589073   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:49.589417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:49.589467   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:50.089159   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.089254   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.089647   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:50.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:29:50.588947   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:50.589215   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.089071   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.089145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.089475   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:51.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:29:51.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:51.589365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:52.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.089238   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:52.089288   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:52.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:29:52.589008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:52.589313   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.089355   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:53.589750   48804 type.go:168] "Request Body" body=""
	I1201 19:29:53.589814   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:53.590123   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:54.089823   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.089898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.090247   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:54.090303   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:54.588852   48804 type.go:168] "Request Body" body=""
	I1201 19:29:54.588930   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:54.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.088914   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.089270   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:55.588966   48804 type.go:168] "Request Body" body=""
	I1201 19:29:55.589042   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:55.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.089360   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:56.589038   48804 type.go:168] "Request Body" body=""
	I1201 19:29:56.589104   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:56.589401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:56.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:57.089011   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.089090   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:57.588994   48804 type.go:168] "Request Body" body=""
	I1201 19:29:57.589111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:57.589436   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.089014   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.089087   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.089394   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:58.588939   48804 type.go:168] "Request Body" body=""
	I1201 19:29:58.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:58.589358   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:29:59.088907   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.088987   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.089299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:29:59.089369   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:29:59.589697   48804 type.go:168] "Request Body" body=""
	I1201 19:29:59.589768   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:29:59.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.089253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:00.588991   48804 type.go:168] "Request Body" body=""
	I1201 19:30:00.589075   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:00.589446   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:01.089610   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.089745   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:01.090102   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:01.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:01.589966   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:01.590319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.089172   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.089260   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.089600   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:02.588910   48804 type.go:168] "Request Body" body=""
	I1201 19:30:02.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:02.589282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.088937   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.089334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:03.588977   48804 type.go:168] "Request Body" body=""
	I1201 19:30:03.589052   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:03.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:03.589478   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:04.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.089038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.089352   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:04.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:30:04.589010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:04.589299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.088972   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:05.589752   48804 type.go:168] "Request Body" body=""
	I1201 19:30:05.589827   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:05.590136   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:05.590195   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:06.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:06.588908   48804 type.go:168] "Request Body" body=""
	I1201 19:30:06.588989   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:06.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.088896   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.088973   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.089282   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:07.588967   48804 type.go:168] "Request Body" body=""
	I1201 19:30:07.589037   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:07.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:08.089094   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.089179   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.089559   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:08.089615   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:08.589268   48804 type.go:168] "Request Body" body=""
	I1201 19:30:08.589341   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:08.589676   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.089519   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.089597   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.089926   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:09.589719   48804 type.go:168] "Request Body" body=""
	I1201 19:30:09.589797   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:09.590134   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.088842   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.088923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.089248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:10.588961   48804 type.go:168] "Request Body" body=""
	I1201 19:30:10.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:10.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:10.589466   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:11.089455   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.089549   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.089928   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:11.589660   48804 type.go:168] "Request Body" body=""
	I1201 19:30:11.589731   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:11.589984   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.089097   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.089199   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.089561   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:12.589383   48804 type.go:168] "Request Body" body=""
	I1201 19:30:12.589475   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:12.589880   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:12.589952   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:13.089681   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.089750   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.090058   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:13.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:13.589929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:13.590299   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.088944   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:14.588863   48804 type.go:168] "Request Body" body=""
	I1201 19:30:14.588937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:14.589280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:15.088982   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.089066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.089386   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:15.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:15.589628   48804 type.go:168] "Request Body" body=""
	I1201 19:30:15.589698   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:15.590008   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.089799   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.090158   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:16.588891   48804 type.go:168] "Request Body" body=""
	I1201 19:30:16.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:16.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.089018   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:17.588824   48804 type.go:168] "Request Body" body=""
	I1201 19:30:17.588902   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:17.589252   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:17.589312   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:18.088965   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.089050   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:18.589104   48804 type.go:168] "Request Body" body=""
	I1201 19:30:18.589181   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:18.589539   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.089070   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.089333   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:19.589020   48804 type.go:168] "Request Body" body=""
	I1201 19:30:19.589098   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:19.589410   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:19.589458   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:20.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.089031   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.089400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:20.589161   48804 type.go:168] "Request Body" body=""
	I1201 19:30:20.589230   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:20.589528   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.089246   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.089319   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.089743   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:21.589336   48804 type.go:168] "Request Body" body=""
	I1201 19:30:21.589427   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:21.589837   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:21.589900   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:22.089716   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.089803   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.090099   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:22.589890   48804 type.go:168] "Request Body" body=""
	I1201 19:30:22.589969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:22.590315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.088903   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.088983   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:23.588820   48804 type.go:168] "Request Body" body=""
	I1201 19:30:23.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:23.589157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:24.088933   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.089010   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.089362   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:24.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:24.589099   48804 type.go:168] "Request Body" body=""
	I1201 19:30:24.589172   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:24.589544   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.089055   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.089127   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.089434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:25.588952   48804 type.go:168] "Request Body" body=""
	I1201 19:30:25.589026   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:25.589347   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:26.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.089020   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:26.089524   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:26.588879   48804 type.go:168] "Request Body" body=""
	I1201 19:30:26.588945   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:26.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.088899   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.088972   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.089314   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:27.589053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:27.589130   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:27.589456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.088844   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.088911   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.089168   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:28.588916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:28.589017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:28.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:28.589390   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:29.088940   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.089009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:29.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:30:29.589072   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:29.589384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.088996   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.089095   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.089945   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:30.589738   48804 type.go:168] "Request Body" body=""
	I1201 19:30:30.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:30.590195   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:30.590251   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:31.089045   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.089111   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.089438   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:31.588985   48804 type.go:168] "Request Body" body=""
	I1201 19:30:31.589056   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:31.589357   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.088915   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.089324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:32.588989   48804 type.go:168] "Request Body" body=""
	I1201 19:30:32.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:32.589324   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:33.088946   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.089016   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.089384   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:33.089440   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:33.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:33.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:33.589343   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.089030   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.089456   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:34.588957   48804 type.go:168] "Request Body" body=""
	I1201 19:30:34.589038   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:34.589373   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:35.089090   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.089168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.089549   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:35.089625   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:35.588821   48804 type.go:168] "Request Body" body=""
	I1201 19:30:35.588898   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:35.589161   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.088888   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.088971   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.089321   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:36.589745   48804 type.go:168] "Request Body" body=""
	I1201 19:30:36.589817   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:36.590097   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:37.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.089699   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.089969   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:37.090012   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:37.589546   48804 type.go:168] "Request Body" body=""
	I1201 19:30:37.589637   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:37.589963   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.089733   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.089804   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.090142   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:38.589803   48804 type.go:168] "Request Body" body=""
	I1201 19:30:38.589876   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:38.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.088904   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.088981   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.089329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:39.589036   48804 type.go:168] "Request Body" body=""
	I1201 19:30:39.589107   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:39.589441   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:39.589515   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:40.088909   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.088997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.089345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:40.589047   48804 type.go:168] "Request Body" body=""
	I1201 19:30:40.589120   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:40.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.089436   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.089564   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.089897   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:41.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:30:41.589633   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:41.589911   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:41.589956   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:42.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.088937   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.089280   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:42.588911   48804 type.go:168] "Request Body" body=""
	I1201 19:30:42.588990   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:42.589331   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.088953   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.089365   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:43.588921   48804 type.go:168] "Request Body" body=""
	I1201 19:30:43.588995   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:43.589312   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:44.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.089017   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.089337   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:44.089388   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:44.589855   48804 type.go:168] "Request Body" body=""
	I1201 19:30:44.589923   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:44.590187   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.089403   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:45.589135   48804 type.go:168] "Request Body" body=""
	I1201 19:30:45.589226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:45.589637   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.088870   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.088951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.089279   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:46.588932   48804 type.go:168] "Request Body" body=""
	I1201 19:30:46.589004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:46.589345   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:46.589399   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:47.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.088994   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.089351   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:47.588873   48804 type.go:168] "Request Body" body=""
	I1201 19:30:47.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:47.589265   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.089014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:48.589110   48804 type.go:168] "Request Body" body=""
	I1201 19:30:48.589189   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:48.589550   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:48.589608   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:49.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.089255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:49.588894   48804 type.go:168] "Request Body" body=""
	I1201 19:30:49.588965   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:49.589274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.089382   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:50.588817   48804 type.go:168] "Request Body" body=""
	I1201 19:30:50.588886   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:50.589146   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:51.089119   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.089223   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.089571   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:51.089630   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:51.589291   48804 type.go:168] "Request Body" body=""
	I1201 19:30:51.589384   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:51.589728   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.089674   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.089747   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.090013   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:52.589771   48804 type.go:168] "Request Body" body=""
	I1201 19:30:52.589847   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:52.590191   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:53.089897   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.089975   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.090297   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:53.090359   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:53.589788   48804 type.go:168] "Request Body" body=""
	I1201 19:30:53.589869   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:53.590118   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.088938   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.089272   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:54.588948   48804 type.go:168] "Request Body" body=""
	I1201 19:30:54.589019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:54.589363   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.088956   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.089040   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.089401   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:55.588927   48804 type.go:168] "Request Body" body=""
	I1201 19:30:55.588997   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:55.589329   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:55.589383   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:56.089053   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.089147   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.089578   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:56.588865   48804 type.go:168] "Request Body" body=""
	I1201 19:30:56.588939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:56.589253   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.088920   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.088996   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.089302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:57.588942   48804 type.go:168] "Request Body" body=""
	I1201 19:30:57.589030   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:57.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:58.089706   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.089773   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.090032   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:30:58.090073   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:30:58.589805   48804 type.go:168] "Request Body" body=""
	I1201 19:30:58.589877   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:58.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.088885   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.088954   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.089285   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:30:59.589709   48804 type.go:168] "Request Body" body=""
	I1201 19:30:59.589783   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:30:59.590045   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:00.089976   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.090062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.090455   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:00.090523   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:00.588940   48804 type.go:168] "Request Body" body=""
	I1201 19:31:00.589022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:00.589325   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.089193   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.089258   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.089567   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:01.589248   48804 type.go:168] "Request Body" body=""
	I1201 19:31:01.589320   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:01.589696   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.089617   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.089689   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.090033   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:02.589742   48804 type.go:168] "Request Body" body=""
	I1201 19:31:02.589809   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:02.590065   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:02.590107   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:03.089840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.089919   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.090274   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:03.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:03.588964   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:03.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.088868   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.088940   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.089202   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:04.588907   48804 type.go:168] "Request Body" body=""
	I1201 19:31:04.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:04.589308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:05.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.089004   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.089341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:05.089397   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:05.589817   48804 type.go:168] "Request Body" body=""
	I1201 19:31:05.589881   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:05.590139   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.088823   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.088913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.089226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:06.588928   48804 type.go:168] "Request Body" body=""
	I1201 19:31:06.589003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:06.589354   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.088881   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.088956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.089268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:07.589804   48804 type.go:168] "Request Body" body=""
	I1201 19:31:07.589883   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:07.590226   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:07.590283   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:08.088945   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.089021   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.089427   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:08.588851   48804 type.go:168] "Request Body" body=""
	I1201 19:31:08.588922   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:08.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.088893   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.089315   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:09.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:09.588982   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:09.589323   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:10.088978   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.089059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.089383   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:10.089436   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:10.589014   48804 type.go:168] "Request Body" body=""
	I1201 19:31:10.589086   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:10.589443   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.089293   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.089375   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.089754   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:11.588862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:11.588934   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:11.589248   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:12.088938   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.089019   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.089414   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:12.089477   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:12.588943   48804 type.go:168] "Request Body" body=""
	I1201 19:31:12.589029   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:12.589424   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.088863   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.088931   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.089236   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:13.588944   48804 type.go:168] "Request Body" body=""
	I1201 19:31:13.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:13.589400   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.088981   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.089062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.089389   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:14.589792   48804 type.go:168] "Request Body" body=""
	I1201 19:31:14.589864   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:14.590157   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:14.590205   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:15.089917   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.089998   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.090393   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:15.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:15.589043   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:15.589442   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.089755   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.089823   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.090149   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:16.588885   48804 type.go:168] "Request Body" body=""
	I1201 19:31:16.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:16.589268   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:17.088952   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.089027   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:17.089421   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:17.589041   48804 type.go:168] "Request Body" body=""
	I1201 19:31:17.589117   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:17.589376   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.088925   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.089022   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.089353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:18.589024   48804 type.go:168] "Request Body" body=""
	I1201 19:31:18.589103   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:18.589390   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.089036   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:19.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:19.589006   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:19.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:19.589433   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:20.088966   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.089045   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.089415   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:20.589821   48804 type.go:168] "Request Body" body=""
	I1201 19:31:20.589892   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:20.590189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.089147   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.089226   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.089557   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:21.588941   48804 type.go:168] "Request Body" body=""
	I1201 19:31:21.589012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:21.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:22.089347   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.089422   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.089710   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:22.089757   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:22.589558   48804 type.go:168] "Request Body" body=""
	I1201 19:31:22.589640   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:22.589978   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.089774   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.089855   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.090209   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:23.589840   48804 type.go:168] "Request Body" body=""
	I1201 19:31:23.589913   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:23.590166   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.089300   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:24.588860   48804 type.go:168] "Request Body" body=""
	I1201 19:31:24.588943   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:24.589281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:24.589334   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:25.089831   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.089896   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.090189   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:25.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:25.588959   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:25.589302   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.088866   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.088948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.089281   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:26.589601   48804 type.go:168] "Request Body" body=""
	I1201 19:31:26.589668   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:26.589943   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:26.589982   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:27.088858   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.088939   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.089293   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:27.588870   48804 type.go:168] "Request Body" body=""
	I1201 19:31:27.588951   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:27.589303   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.088862   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.089205   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:28.588930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:28.589061   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:28.589381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:29.088916   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.089003   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:29.089377   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:29.588973   48804 type.go:168] "Request Body" body=""
	I1201 19:31:29.589059   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:29.589349   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.088974   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.089053   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.089429   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:30.588993   48804 type.go:168] "Request Body" body=""
	I1201 19:31:30.589066   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:30.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:31.089200   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.089274   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.089577   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:31.089637   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:31.588904   48804 type.go:168] "Request Body" body=""
	I1201 19:31:31.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:31.589330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.089264   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.089340   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.089680   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:32.589442   48804 type.go:168] "Request Body" body=""
	I1201 19:31:32.589524   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:32.589781   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:33.089603   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.089675   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.089988   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:33.090052   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:33.589773   48804 type.go:168] "Request Body" body=""
	I1201 19:31:33.589845   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:33.590174   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.089801   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.089871   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.090171   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:34.588903   48804 type.go:168] "Request Body" body=""
	I1201 19:31:34.588980   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:34.589294   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.088936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.089011   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.089369   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:35.589740   48804 type.go:168] "Request Body" body=""
	I1201 19:31:35.589810   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:35.590064   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:35.590105   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:36.089859   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.089929   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.090255   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:36.588923   48804 type.go:168] "Request Body" body=""
	I1201 19:31:36.589001   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:36.589334   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.088876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.088941   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.089192   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:37.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:31:37.588956   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:37.589311   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:38.088964   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.089041   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:38.089442   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:38.588876   48804 type.go:168] "Request Body" body=""
	I1201 19:31:38.588953   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:38.589211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.088951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.089039   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:39.588897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:39.588978   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:39.589322   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.089017   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.089088   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.089380   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:40.589633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:40.589707   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:40.590026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:40.590081   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:41.089028   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.089102   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.089423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:41.589101   48804 type.go:168] "Request Body" body=""
	I1201 19:31:41.589178   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:41.589434   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.089390   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.089474   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.089854   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:42.589650   48804 type.go:168] "Request Body" body=""
	I1201 19:31:42.589727   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:42.590091   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:42.590148   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:43.089748   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.089825   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.090133   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:43.588864   48804 type.go:168] "Request Body" body=""
	I1201 19:31:43.588944   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:43.589249   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.088947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.089028   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.089361   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:44.588882   48804 type.go:168] "Request Body" body=""
	I1201 19:31:44.588948   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:44.589201   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:45.088921   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.089330   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:45.089382   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:45.589179   48804 type.go:168] "Request Body" body=""
	I1201 19:31:45.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:45.589564   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.089258   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.089345   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.089648   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:46.589361   48804 type.go:168] "Request Body" body=""
	I1201 19:31:46.589436   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:46.589775   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:47.089602   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.089682   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.090003   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:47.090068   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:47.589765   48804 type.go:168] "Request Body" body=""
	I1201 19:31:47.589836   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:47.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.088843   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.088918   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.089233   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:48.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:48.589032   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:48.589416   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.089114   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.089186   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.089669   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:49.589463   48804 type.go:168] "Request Body" body=""
	I1201 19:31:49.589550   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:49.589841   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:49.589889   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:50.089633   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.089706   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.090067   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:50.589698   48804 type.go:168] "Request Body" body=""
	I1201 19:31:50.589782   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:50.590096   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.089162   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.089244   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.089563   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:51.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:51.589007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:51.589348   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:52.088879   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.088949   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.089211   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:52.089255   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:52.588951   48804 type.go:168] "Request Body" body=""
	I1201 19:31:52.589060   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:52.589479   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.088939   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.089012   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:53.589094   48804 type.go:168] "Request Body" body=""
	I1201 19:31:53.589168   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:53.589423   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:54.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.089381   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:54.089441   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:54.588947   48804 type.go:168] "Request Body" body=""
	I1201 19:31:54.589025   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:54.589375   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.088930   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.088999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.089276   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:55.588954   48804 type.go:168] "Request Body" body=""
	I1201 19:31:55.589035   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:55.589378   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:56.088961   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.089044   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.089405   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:56.089463   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:56.588877   48804 type.go:168] "Request Body" body=""
	I1201 19:31:56.588950   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:56.589212   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.088999   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.089074   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.089387   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:57.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:31:57.589009   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:57.589339   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.088871   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.088974   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.089278   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:58.588934   48804 type.go:168] "Request Body" body=""
	I1201 19:31:58.589014   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:58.589453   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:31:58.589546   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:31:59.088897   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.088969   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.089308   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:31:59.588990   48804 type.go:168] "Request Body" body=""
	I1201 19:31:59.589062   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:31:59.589367   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.089002   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.089081   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.089412   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:00.589613   48804 type.go:168] "Request Body" body=""
	I1201 19:32:00.589705   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:00.590100   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:00.590166   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:01.088830   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.088901   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.089237   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:01.588936   48804 type.go:168] "Request Body" body=""
	I1201 19:32:01.589015   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:01.589341   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.089351   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.089432   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.089784   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:02.589531   48804 type.go:168] "Request Body" body=""
	I1201 19:32:02.589609   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:02.589892   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:03.089722   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.089794   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.090159   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:03.090212   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:03.588901   48804 type.go:168] "Request Body" body=""
	I1201 19:32:03.588985   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:03.589338   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.089007   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.089319   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:04.588917   48804 type.go:168] "Request Body" body=""
	I1201 19:32:04.588999   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:04.589336   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.089026   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.089164   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.089649   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:05.589075   48804 type.go:168] "Request Body" body=""
	I1201 19:32:05.589145   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:05.589411   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:05.589452   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:06.088935   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.089008   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.089370   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:06.588925   48804 type.go:168] "Request Body" body=""
	I1201 19:32:06.589036   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:06.589353   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.089795   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.089860   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.090124   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:07.588839   48804 type.go:168] "Request Body" body=""
	I1201 19:32:07.588910   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:07.589229   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:08.088955   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.089033   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.089374   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:08.089432   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:08.588931   48804 type.go:168] "Request Body" body=""
	I1201 19:32:08.589002   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:08.589283   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.088948   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.089034   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.089417   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:09.589158   48804 type.go:168] "Request Body" body=""
	I1201 19:32:09.589251   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:09.589644   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:10.089588   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.089666   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.090026   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1201 19:32:10.090112   48804 node_ready.go:55] error getting node "functional-428744" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-428744": dial tcp 192.168.49.2:8441: connect: connection refused
	I1201 19:32:10.588810   48804 type.go:168] "Request Body" body=""
	I1201 19:32:10.588889   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:10.589228   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.089103   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.089180   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.089540   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:11.588878   48804 type.go:168] "Request Body" body=""
	I1201 19:32:11.588946   48804 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-428744" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1201 19:32:11.589208   48804 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1201 19:32:12.088890   48804 type.go:168] "Request Body" body=""
	I1201 19:32:12.089251   48804 node_ready.go:38] duration metric: took 6m0.000540563s for node "functional-428744" to be "Ready" ...
	I1201 19:32:12.092425   48804 out.go:203] 
	W1201 19:32:12.095253   48804 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1201 19:32:12.095277   48804 out.go:285] * 
	W1201 19:32:12.097463   48804 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:32:12.100606   48804 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:19 functional-428744 containerd[5833]: time="2025-12-01T19:32:19.453990081Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.487184870Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.489388253Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.502137516Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:20 functional-428744 containerd[5833]: time="2025-12-01T19:32:20.502566149Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.487479497Z" level=info msg="No images store for sha256:9a78e6e24df19d4b5ee9819f74178ce844a778e46ad5f9dc53101feb167831e4"
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.490112047Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-428744\""
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.497324365Z" level=info msg="ImageCreate event name:\"sha256:4f3a5d641d9b7a5007231441eda3adf17b6874d8b72429dc7a44618c67a293d6\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:21 functional-428744 containerd[5833]: time="2025-12-01T19:32:21.497866456Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-428744\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.289150852Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.291773015Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.293690572Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 01 19:32:22 functional-428744 containerd[5833]: time="2025-12-01T19:32:22.305668049Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.267444727Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.269891231Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.271821161Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.292828735Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.400987063Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.403215357Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.410469710Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.411816327Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.523705796Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.526083600Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.533210046Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:32:23 functional-428744 containerd[5833]: time="2025-12-01T19:32:23.533572556Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:32:27.650042    9961 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:27.650606    9961 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:27.652527    9961 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:27.653114    9961 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:32:27.654829    9961 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:32:27 up  1:14,  0 user,  load average: 0.42, 0.33, 0.59
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:32:24 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 826.
	Dec 01 19:32:25 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:25 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:25 functional-428744 kubelet[9768]: E1201 19:32:25.165328    9768 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Dec 01 19:32:25 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:25 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:25 functional-428744 kubelet[9834]: E1201 19:32:25.918138    9834 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:25 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:26 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 828.
	Dec 01 19:32:26 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:26 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:26 functional-428744 kubelet[9855]: E1201 19:32:26.652746    9855 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:26 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:26 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:32:27 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 829.
	Dec 01 19:32:27 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:27 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:32:27 functional-428744 kubelet[9889]: E1201 19:32:27.401921    9889 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:32:27 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:32:27 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (345.868825ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly (2.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig (737.97s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-428744 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1201 19:34:27.088053    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:36:46.979140    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:38:10.044154    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:39:27.088088    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:41:46.979817    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:44:27.087783    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-428744 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: exit status 109 (12m15.562864304s)

-- stdout --
	* [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	  - apiserver.enable-admission-plugins=NamespaceAutoProvision
	
	

-- /stdout --
** stderr ** 
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000165379s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
functional_test.go:774: failed to restart minikube. args "out/minikube-linux-arm64 start -p functional-428744 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all": exit status 109
functional_test.go:776: restart took 12m15.564116517s for "functional-428744" cluster.
I1201 19:44:44.192118    4305 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (310.958823ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 logs -n 25: (1.134614875s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-019259 image ls --format short --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format json --alsologtostderr                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format table --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh     │ functional-019259 ssh pgrep buildkitd                                                                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ image   │ functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr                                                  │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls                                                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ delete  │ -p functional-019259                                                                                                                                    │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ start   │ -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ start   │ -p functional-428744 --alsologtostderr -v=8                                                                                                             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:26 UTC │                     │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:latest                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add minikube-local-cache-test:functional-428744                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache delete minikube-local-cache-test:functional-428744                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl images                                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ cache   │ functional-428744 cache reload                                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ kubectl │ functional-428744 kubectl -- --context functional-428744 get pods                                                                                       │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ start   │ -p functional-428744 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:32:28
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:32:28.671063   54581 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:32:28.671177   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671181   54581 out.go:374] Setting ErrFile to fd 2...
	I1201 19:32:28.671185   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671462   54581 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:32:28.671791   54581 out.go:368] Setting JSON to false
	I1201 19:32:28.672593   54581 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4500,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:32:28.672645   54581 start.go:143] virtualization:  
	I1201 19:32:28.676118   54581 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:32:28.679062   54581 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:32:28.679153   54581 notify.go:221] Checking for updates...
	I1201 19:32:28.685968   54581 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:32:28.688852   54581 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:32:28.691733   54581 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:32:28.694613   54581 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:32:28.697549   54581 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:32:28.700837   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:28.700934   54581 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:32:28.730800   54581 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:32:28.730894   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.786972   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.776963779 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.787065   54581 docker.go:319] overlay module found
	I1201 19:32:28.789990   54581 out.go:179] * Using the docker driver based on existing profile
	I1201 19:32:28.792702   54581 start.go:309] selected driver: docker
	I1201 19:32:28.792712   54581 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disa
bleCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.792814   54581 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:32:28.792926   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.854079   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.841219008 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.854498   54581 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1201 19:32:28.854520   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:28.854580   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:28.854619   54581 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.858061   54581 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:32:28.860972   54581 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:32:28.863997   54581 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:32:28.866788   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:28.866980   54581 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:32:28.895611   54581 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:32:28.895623   54581 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:32:28.922565   54581 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:32:29.117617   54581 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:32:29.117759   54581 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:32:29.117789   54581 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117872   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:32:29.117882   54581 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 108.863µs
	I1201 19:32:29.117888   54581 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:32:29.117898   54581 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117926   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:32:29.117930   54581 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 33.443µs
	I1201 19:32:29.117935   54581 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117944   54581 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117979   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:32:29.117983   54581 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.647µs
	I1201 19:32:29.117988   54581 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117998   54581 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118023   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:32:29.118035   54581 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 30.974µs
	I1201 19:32:29.118040   54581 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118040   54581 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:32:29.118048   54581 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118072   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:32:29.118066   54581 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118075   54581 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 28.709µs
	I1201 19:32:29.118080   54581 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118088   54581 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118102   54581 start.go:364] duration metric: took 25.197µs to acquireMachinesLock for "functional-428744"
	I1201 19:32:29.118113   54581 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:32:29.118114   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:32:29.118117   54581 fix.go:54] fixHost starting: 
	I1201 19:32:29.118118   54581 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.457µs
	I1201 19:32:29.118122   54581 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:32:29.118129   54581 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118152   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:32:29.118156   54581 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 27.199µs
	I1201 19:32:29.118160   54581 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:32:29.118167   54581 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118216   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:32:29.118220   54581 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 53.562µs
	I1201 19:32:29.118229   54581 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:32:29.118236   54581 cache.go:87] Successfully saved all images to host disk.
	I1201 19:32:29.118392   54581 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:32:29.135509   54581 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:32:29.135543   54581 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:32:29.140504   54581 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:32:29.140530   54581 machine.go:94] provisionDockerMachine start ...
	I1201 19:32:29.140609   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.157677   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.157997   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.158004   54581 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:32:29.305012   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.305026   54581 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:32:29.305098   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.323134   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.323429   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.323437   54581 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:32:29.478458   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.478532   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.497049   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.498161   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.498184   54581 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:32:29.645663   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:32:29.645679   54581 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:32:29.645696   54581 ubuntu.go:190] setting up certificates
	I1201 19:32:29.645703   54581 provision.go:84] configureAuth start
	I1201 19:32:29.645772   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:29.663161   54581 provision.go:143] copyHostCerts
	I1201 19:32:29.663227   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:32:29.663233   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:32:29.663306   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:32:29.663413   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:32:29.663416   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:32:29.663441   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:32:29.663488   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:32:29.663496   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:32:29.663517   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:32:29.663560   54581 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:32:29.922590   54581 provision.go:177] copyRemoteCerts
	I1201 19:32:29.922645   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:32:29.922682   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.944750   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.066257   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:32:30.114189   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:32:30.139869   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:32:30.162018   54581 provision.go:87] duration metric: took 516.289617ms to configureAuth
	I1201 19:32:30.162044   54581 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:32:30.162294   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:30.162301   54581 machine.go:97] duration metric: took 1.021765793s to provisionDockerMachine
	I1201 19:32:30.162308   54581 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:32:30.162319   54581 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:32:30.162368   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:32:30.162422   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.181979   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.285977   54581 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:32:30.289531   54581 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:32:30.289549   54581 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:32:30.289559   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:32:30.289616   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:32:30.289694   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:32:30.289767   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:32:30.289821   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:32:30.297763   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:30.315893   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:32:30.335096   54581 start.go:296] duration metric: took 172.774471ms for postStartSetup
	I1201 19:32:30.335168   54581 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:32:30.335214   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.355398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.458545   54581 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:32:30.463103   54581 fix.go:56] duration metric: took 1.344978374s for fixHost
	I1201 19:32:30.463118   54581 start.go:83] releasing machines lock for "functional-428744", held for 1.345010357s
	I1201 19:32:30.463185   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:30.480039   54581 ssh_runner.go:195] Run: cat /version.json
	I1201 19:32:30.480081   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.480337   54581 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:32:30.480395   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.499221   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.501398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.601341   54581 ssh_runner.go:195] Run: systemctl --version
	I1201 19:32:30.695138   54581 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1201 19:32:30.699523   54581 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:32:30.699612   54581 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:32:30.707379   54581 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:32:30.707392   54581 start.go:496] detecting cgroup driver to use...
	I1201 19:32:30.707423   54581 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:32:30.707469   54581 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:32:30.722782   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:32:30.736023   54581 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:32:30.736084   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:32:30.751857   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:32:30.765106   54581 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:32:30.881005   54581 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:32:31.019194   54581 docker.go:234] disabling docker service ...
	I1201 19:32:31.019259   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:32:31.037044   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:32:31.052926   54581 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:32:31.181456   54581 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:32:31.340481   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:32:31.355001   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:32:31.370840   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:32:31.380231   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:32:31.389693   54581 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:32:31.389764   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:32:31.399360   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.408437   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:32:31.417370   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.426455   54581 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:32:31.434636   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:32:31.443735   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:32:31.453324   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:32:31.462516   54581 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:32:31.470270   54581 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:32:31.478172   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:31.592137   54581 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:32:31.712107   54581 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:32:31.712186   54581 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:32:31.715994   54581 start.go:564] Will wait 60s for crictl version
	I1201 19:32:31.716056   54581 ssh_runner.go:195] Run: which crictl
	I1201 19:32:31.719610   54581 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:32:31.745073   54581 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:32:31.745152   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.765358   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.791628   54581 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:32:31.794721   54581 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:32:31.811133   54581 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:32:31.818179   54581 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1201 19:32:31.821064   54581 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:32:31.821193   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:31.821269   54581 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:32:31.856356   54581 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:32:31.856368   54581 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:32:31.856374   54581 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:32:31.856475   54581 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:32:31.856536   54581 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:32:31.895308   54581 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1201 19:32:31.895325   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:31.895333   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:31.895346   54581 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:32:31.895366   54581 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:32:31.895478   54581 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 19:32:31.895541   54581 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:32:31.905339   54581 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:32:31.905406   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:32:31.913323   54581 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:32:31.927846   54581 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:32:31.940396   54581 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1201 19:32:31.953139   54581 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:32:31.956806   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:32.073166   54581 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:32:32.587407   54581 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:32:32.587419   54581 certs.go:195] generating shared ca certs ...
	I1201 19:32:32.587436   54581 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:32:32.587628   54581 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:32:32.587672   54581 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:32:32.587679   54581 certs.go:257] generating profile certs ...
	I1201 19:32:32.587796   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:32:32.587858   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:32:32.587895   54581 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:32:32.588027   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:32:32.588060   54581 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:32:32.588067   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:32:32.588104   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:32:32.588128   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:32:32.588158   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:32:32.588202   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:32.589935   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:32:32.611510   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:32:32.631449   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:32:32.652864   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:32:32.672439   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:32:32.690857   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:32:32.709160   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:32:32.727076   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:32:32.745055   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:32:32.762625   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:32:32.780355   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:32:32.797626   54581 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:32:32.810250   54581 ssh_runner.go:195] Run: openssl version
	I1201 19:32:32.816425   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:32:32.825294   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829094   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829148   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.869893   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:32:32.877720   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:32:32.886198   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889911   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889967   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.930479   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:32:32.938463   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:32:32.946940   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950621   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950676   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.991499   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:32:32.999452   54581 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:32:33.003313   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:32:33.045305   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:32:33.087269   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:32:33.128376   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:32:33.169796   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:32:33.211259   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:32:33.257335   54581 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:33.257412   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:32:33.257501   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.284260   54581 cri.go:89] found id: ""
	I1201 19:32:33.284320   54581 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:32:33.292458   54581 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:32:33.292468   54581 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:32:33.292518   54581 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:32:33.300158   54581 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.300668   54581 kubeconfig.go:125] found "functional-428744" server: "https://192.168.49.2:8441"
	I1201 19:32:33.301960   54581 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:32:33.310120   54581 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-01 19:17:59.066738599 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-01 19:32:31.946987775 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1201 19:32:33.310138   54581 kubeadm.go:1161] stopping kube-system containers ...
	I1201 19:32:33.310149   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1201 19:32:33.310213   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.338492   54581 cri.go:89] found id: ""
	I1201 19:32:33.338551   54581 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1201 19:32:33.356342   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:32:33.364607   54581 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec  1 19:22 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec  1 19:22 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5676 Dec  1 19:22 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5588 Dec  1 19:22 /etc/kubernetes/scheduler.conf
	
	I1201 19:32:33.364669   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:32:33.372608   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:32:33.380647   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.380700   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:32:33.388464   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.397123   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.397189   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.404816   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:32:33.412562   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.412628   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:32:33.420390   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:32:33.428330   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:33.477124   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.484075   54581 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.006926734s)
	I1201 19:32:34.484135   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.694382   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.769616   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.812433   54581 api_server.go:52] waiting for apiserver process to appear ...
	I1201 19:32:34.812505   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.313033   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.812993   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.312704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.813245   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.313300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.812687   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.312636   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.813205   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.813572   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.312587   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.812696   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.313535   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.813472   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.813224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.313067   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.813328   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.312678   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.813484   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.312731   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.812683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.313429   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.813026   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.312606   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.812689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.813689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.313474   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.312618   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.813410   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.313371   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.812979   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.312792   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.812691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.313042   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.813445   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.313212   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.812741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.312722   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.812580   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.313621   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.813459   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.313224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.812880   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.313609   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.313283   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.812739   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.313558   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.813248   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.313098   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.813623   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.313600   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.813357   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.312559   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.812827   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.312653   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.812616   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.313447   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.813117   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.312712   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.812713   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.314198   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.313642   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.813457   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.313464   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.812697   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.312626   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.813299   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.813267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.312931   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.812887   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.312894   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.813197   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.312689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.812595   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.313557   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.313428   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.813327   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.313520   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.812744   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.313564   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.812611   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.313634   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.813393   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.313426   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.812688   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.313372   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.812638   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.313360   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.812897   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.313015   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.813101   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.312709   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.812907   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.312644   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.812569   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.313009   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.813448   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.312851   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.813268   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.313602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.312692   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.813538   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.313307   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.813008   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.313397   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.313454   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.813423   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.313344   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.813145   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:34.312690   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:34.813369   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:34.813443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:34.847624   54581 cri.go:89] found id: ""
	I1201 19:33:34.847638   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.847645   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:34.847650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:34.847707   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:34.877781   54581 cri.go:89] found id: ""
	I1201 19:33:34.877795   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.877802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:34.877807   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:34.877865   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:34.906556   54581 cri.go:89] found id: ""
	I1201 19:33:34.906569   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.906575   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:34.906581   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:34.906638   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:34.932243   54581 cri.go:89] found id: ""
	I1201 19:33:34.932257   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.932264   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:34.932275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:34.932334   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:34.958307   54581 cri.go:89] found id: ""
	I1201 19:33:34.958320   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.958327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:34.958333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:34.958393   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:34.987839   54581 cri.go:89] found id: ""
	I1201 19:33:34.987852   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.987860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:34.987865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:34.987924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:35.013339   54581 cri.go:89] found id: ""
	I1201 19:33:35.013353   54581 logs.go:282] 0 containers: []
	W1201 19:33:35.013360   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:35.013367   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:35.013377   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:35.024284   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:35.024300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:35.102562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:35.102584   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:35.102595   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:35.168823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:35.168843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:35.200459   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:35.200475   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:37.759267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:37.769446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:37.769528   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:37.794441   54581 cri.go:89] found id: ""
	I1201 19:33:37.794454   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.794461   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:37.794467   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:37.794522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:37.825029   54581 cri.go:89] found id: ""
	I1201 19:33:37.825042   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.825049   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:37.825059   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:37.825116   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:37.855847   54581 cri.go:89] found id: ""
	I1201 19:33:37.855860   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.855867   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:37.855872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:37.855932   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:37.892812   54581 cri.go:89] found id: ""
	I1201 19:33:37.892826   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.892833   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:37.892839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:37.892902   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:37.923175   54581 cri.go:89] found id: ""
	I1201 19:33:37.923189   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.923195   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:37.923201   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:37.923260   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:37.956838   54581 cri.go:89] found id: ""
	I1201 19:33:37.956852   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.956858   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:37.956864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:37.956921   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:37.983288   54581 cri.go:89] found id: ""
	I1201 19:33:37.983302   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.983309   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:37.983317   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:37.983328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:38.048803   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:38.048828   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:38.048842   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:38.114525   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:38.114549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:38.144040   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:38.144056   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:38.203160   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:38.203178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:40.714632   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:40.724993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:40.725058   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:40.749954   54581 cri.go:89] found id: ""
	I1201 19:33:40.749968   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.749975   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:40.749981   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:40.750040   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:40.775337   54581 cri.go:89] found id: ""
	I1201 19:33:40.775350   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.775357   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:40.775362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:40.775425   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:40.801568   54581 cri.go:89] found id: ""
	I1201 19:33:40.801582   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.801590   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:40.801595   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:40.801663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:40.829766   54581 cri.go:89] found id: ""
	I1201 19:33:40.829779   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.829786   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:40.829791   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:40.829850   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:40.864362   54581 cri.go:89] found id: ""
	I1201 19:33:40.864376   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.864383   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:40.864389   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:40.864447   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:40.893407   54581 cri.go:89] found id: ""
	I1201 19:33:40.893419   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.893427   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:40.893433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:40.893507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:40.919149   54581 cri.go:89] found id: ""
	I1201 19:33:40.919163   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.919172   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:40.919179   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:40.919189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:40.949474   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:40.949572   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:41.005421   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:41.005440   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:41.016259   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:41.016274   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:41.078378   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:41.078391   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:41.078401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:43.641960   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:43.652106   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:43.652178   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:43.682005   54581 cri.go:89] found id: ""
	I1201 19:33:43.682018   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.682025   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:43.682030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:43.682087   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:43.707580   54581 cri.go:89] found id: ""
	I1201 19:33:43.707593   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.707600   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:43.707606   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:43.707711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:43.732400   54581 cri.go:89] found id: ""
	I1201 19:33:43.732414   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.732421   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:43.732426   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:43.732483   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:43.758218   54581 cri.go:89] found id: ""
	I1201 19:33:43.758232   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.758239   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:43.758245   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:43.758303   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:43.783139   54581 cri.go:89] found id: ""
	I1201 19:33:43.783152   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.783159   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:43.783164   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:43.783227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:43.813453   54581 cri.go:89] found id: ""
	I1201 19:33:43.813467   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.813474   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:43.813480   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:43.813548   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:43.845612   54581 cri.go:89] found id: ""
	I1201 19:33:43.845625   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.845632   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:43.845639   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:43.845649   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:43.909426   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:43.909445   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:43.920543   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:43.920560   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:43.988764   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:43.988776   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:43.988797   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:44.051182   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:44.051208   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:46.583925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:46.594468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:46.594554   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:46.620265   54581 cri.go:89] found id: ""
	I1201 19:33:46.620279   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.620286   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:46.620292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:46.620351   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:46.644633   54581 cri.go:89] found id: ""
	I1201 19:33:46.644652   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.644659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:46.644665   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:46.644721   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:46.669867   54581 cri.go:89] found id: ""
	I1201 19:33:46.669881   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.669888   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:46.669893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:46.669948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:46.694417   54581 cri.go:89] found id: ""
	I1201 19:33:46.694431   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.694438   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:46.694454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:46.694512   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:46.721029   54581 cri.go:89] found id: ""
	I1201 19:33:46.721043   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.721051   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:46.721056   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:46.721114   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:46.747445   54581 cri.go:89] found id: ""
	I1201 19:33:46.747459   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.747466   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:46.747471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:46.747525   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:46.771251   54581 cri.go:89] found id: ""
	I1201 19:33:46.771266   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.771272   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:46.771281   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:46.771290   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:46.829699   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:46.829716   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:46.842077   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:46.842096   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:46.924213   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:46.924225   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:46.924235   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:46.990853   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:46.990872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:49.521683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:49.531974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:49.532042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:49.557473   54581 cri.go:89] found id: ""
	I1201 19:33:49.557514   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.557521   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:49.557527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:49.557640   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:49.583182   54581 cri.go:89] found id: ""
	I1201 19:33:49.583229   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.583237   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:49.583242   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:49.583308   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:49.611533   54581 cri.go:89] found id: ""
	I1201 19:33:49.611546   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.611553   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:49.611559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:49.611615   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:49.637433   54581 cri.go:89] found id: ""
	I1201 19:33:49.637446   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.637460   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:49.637466   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:49.637558   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:49.667274   54581 cri.go:89] found id: ""
	I1201 19:33:49.667287   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.667294   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:49.667299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:49.667358   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:49.696772   54581 cri.go:89] found id: ""
	I1201 19:33:49.696790   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.696797   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:49.696803   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:49.696861   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:49.720607   54581 cri.go:89] found id: ""
	I1201 19:33:49.720621   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.720637   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:49.720645   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:49.720655   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:49.776412   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:49.776431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:49.787417   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:49.787432   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:49.862636   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:49.862647   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:49.862658   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:49.934395   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:49.934421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:52.463339   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:52.473586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:52.473650   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:52.503529   54581 cri.go:89] found id: ""
	I1201 19:33:52.503542   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.503549   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:52.503555   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:52.503618   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:52.531144   54581 cri.go:89] found id: ""
	I1201 19:33:52.531158   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.531165   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:52.531170   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:52.531228   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:52.556664   54581 cri.go:89] found id: ""
	I1201 19:33:52.556678   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.556685   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:52.556691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:52.556753   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:52.583782   54581 cri.go:89] found id: ""
	I1201 19:33:52.583796   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.583802   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:52.583808   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:52.583866   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:52.608468   54581 cri.go:89] found id: ""
	I1201 19:33:52.608481   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.608488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:52.608494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:52.608553   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:52.632068   54581 cri.go:89] found id: ""
	I1201 19:33:52.632081   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.632088   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:52.632093   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:52.632153   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:52.656905   54581 cri.go:89] found id: ""
	I1201 19:33:52.656919   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.656926   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:52.656934   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:52.656944   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:52.715322   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:52.715340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:52.725941   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:52.725956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:52.787814   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:52.787824   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:52.787835   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:52.857124   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:52.857146   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:55.384601   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:55.394657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:55.394724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:55.419003   54581 cri.go:89] found id: ""
	I1201 19:33:55.419016   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.419023   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:55.419028   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:55.419093   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:55.444043   54581 cri.go:89] found id: ""
	I1201 19:33:55.444057   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.444064   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:55.444069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:55.444126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:55.469199   54581 cri.go:89] found id: ""
	I1201 19:33:55.469212   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.469219   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:55.469224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:55.469284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:55.494106   54581 cri.go:89] found id: ""
	I1201 19:33:55.494123   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.494130   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:55.494135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:55.494192   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:55.523658   54581 cri.go:89] found id: ""
	I1201 19:33:55.523671   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.523678   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:55.523683   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:55.523742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:55.549084   54581 cri.go:89] found id: ""
	I1201 19:33:55.549097   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.549105   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:55.549110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:55.549171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:55.573973   54581 cri.go:89] found id: ""
	I1201 19:33:55.573986   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.573993   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:55.574001   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:55.574014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:55.629601   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:55.629618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:55.640511   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:55.640527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:55.703852   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:55.703862   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:55.703875   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:55.767135   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:55.767154   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.297608   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:58.307660   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:58.307729   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:58.331936   54581 cri.go:89] found id: ""
	I1201 19:33:58.331948   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.331955   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:58.331961   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:58.332023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:58.356515   54581 cri.go:89] found id: ""
	I1201 19:33:58.356528   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.356535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:58.356544   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:58.356601   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:58.381178   54581 cri.go:89] found id: ""
	I1201 19:33:58.381191   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.381198   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:58.381203   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:58.381259   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:58.405890   54581 cri.go:89] found id: ""
	I1201 19:33:58.405904   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.405911   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:58.405916   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:58.405971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:58.429783   54581 cri.go:89] found id: ""
	I1201 19:33:58.429796   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.429804   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:58.429809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:58.429875   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:58.454357   54581 cri.go:89] found id: ""
	I1201 19:33:58.454370   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.454377   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:58.454383   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:58.454443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:58.483382   54581 cri.go:89] found id: ""
	I1201 19:33:58.483395   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.483403   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:58.483410   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:58.483421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:58.494465   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:58.494480   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:58.557097   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:58.557108   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:58.557119   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:58.624200   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:58.624219   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.654678   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:58.654694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.213704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:01.225298   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:01.225360   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:01.251173   54581 cri.go:89] found id: ""
	I1201 19:34:01.251187   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.251194   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:01.251200   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:01.251272   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:01.278884   54581 cri.go:89] found id: ""
	I1201 19:34:01.278897   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.278904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:01.278910   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:01.278967   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:01.305393   54581 cri.go:89] found id: ""
	I1201 19:34:01.305407   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.305414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:01.305419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:01.305522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:01.331958   54581 cri.go:89] found id: ""
	I1201 19:34:01.331971   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.331978   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:01.331983   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:01.332042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:01.357701   54581 cri.go:89] found id: ""
	I1201 19:34:01.357714   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.357721   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:01.357727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:01.357786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:01.384631   54581 cri.go:89] found id: ""
	I1201 19:34:01.384645   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.384662   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:01.384668   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:01.384742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:01.410554   54581 cri.go:89] found id: ""
	I1201 19:34:01.410567   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.410574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:01.410582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:01.410591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.466596   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:01.466614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:01.477827   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:01.477843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:01.543509   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:01.543518   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:01.543529   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:01.606587   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:01.606608   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:04.136300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:04.146336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:04.146412   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:04.177880   54581 cri.go:89] found id: ""
	I1201 19:34:04.177894   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.177901   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:04.177906   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:04.177971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:04.203986   54581 cri.go:89] found id: ""
	I1201 19:34:04.203999   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.204006   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:04.204012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:04.204068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:04.228899   54581 cri.go:89] found id: ""
	I1201 19:34:04.228912   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.228920   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:04.228925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:04.228989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:04.254700   54581 cri.go:89] found id: ""
	I1201 19:34:04.254715   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.254722   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:04.254729   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:04.254788   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:04.280370   54581 cri.go:89] found id: ""
	I1201 19:34:04.280383   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.280390   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:04.280396   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:04.280453   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:04.304821   54581 cri.go:89] found id: ""
	I1201 19:34:04.304834   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.304842   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:04.304847   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:04.304910   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:04.331513   54581 cri.go:89] found id: ""
	I1201 19:34:04.331525   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.331533   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:04.331540   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:04.331550   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:04.390353   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:04.390371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:04.403182   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:04.403198   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:04.471239   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:04.471261   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:04.471273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:04.534546   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:04.534567   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:07.063925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:07.074362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:07.074427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:07.107919   54581 cri.go:89] found id: ""
	I1201 19:34:07.107933   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.107940   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:07.107946   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:07.108003   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:07.137952   54581 cri.go:89] found id: ""
	I1201 19:34:07.137965   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.137973   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:07.137978   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:07.138038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:07.172024   54581 cri.go:89] found id: ""
	I1201 19:34:07.172037   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.172044   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:07.172049   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:07.172107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:07.196732   54581 cri.go:89] found id: ""
	I1201 19:34:07.196745   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.196752   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:07.196759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:07.196814   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:07.221862   54581 cri.go:89] found id: ""
	I1201 19:34:07.221875   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.221882   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:07.221888   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:07.221947   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:07.249751   54581 cri.go:89] found id: ""
	I1201 19:34:07.249765   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.249771   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:07.249777   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:07.249833   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:07.275027   54581 cri.go:89] found id: ""
	I1201 19:34:07.275040   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.275047   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:07.275055   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:07.275065   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:07.330139   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:07.330156   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:07.341431   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:07.341447   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:07.404752   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:07.404762   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:07.404780   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:07.471227   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:07.471244   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.003255   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:10.013892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:10.013949   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:10.059011   54581 cri.go:89] found id: ""
	I1201 19:34:10.059025   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.059033   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:10.059039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:10.059101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:10.096138   54581 cri.go:89] found id: ""
	I1201 19:34:10.096152   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.096170   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:10.096177   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:10.096282   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:10.138539   54581 cri.go:89] found id: ""
	I1201 19:34:10.138600   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.138612   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:10.138618   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:10.138688   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:10.168476   54581 cri.go:89] found id: ""
	I1201 19:34:10.168490   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.168497   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:10.168502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:10.168580   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:10.194454   54581 cri.go:89] found id: ""
	I1201 19:34:10.194480   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.194487   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:10.194493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:10.194560   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:10.219419   54581 cri.go:89] found id: ""
	I1201 19:34:10.219432   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.219439   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:10.219445   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:10.219507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:10.244925   54581 cri.go:89] found id: ""
	I1201 19:34:10.244938   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.244945   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:10.244953   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:10.244964   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:10.311653   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:10.311663   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:10.311673   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:10.377857   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:10.377877   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.407833   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:10.407851   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:10.467737   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:10.467757   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:12.980376   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:12.990779   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:12.990838   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:13.016106   54581 cri.go:89] found id: ""
	I1201 19:34:13.016120   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.016127   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:13.016133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:13.016198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:13.044361   54581 cri.go:89] found id: ""
	I1201 19:34:13.044375   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.044382   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:13.044387   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:13.044444   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:13.069827   54581 cri.go:89] found id: ""
	I1201 19:34:13.069841   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.069849   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:13.069854   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:13.069913   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:13.110851   54581 cri.go:89] found id: ""
	I1201 19:34:13.110864   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.110871   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:13.110876   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:13.110933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:13.141612   54581 cri.go:89] found id: ""
	I1201 19:34:13.141626   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.141633   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:13.141638   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:13.141695   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:13.168579   54581 cri.go:89] found id: ""
	I1201 19:34:13.168592   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.168599   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:13.168604   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:13.168676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:13.194182   54581 cri.go:89] found id: ""
	I1201 19:34:13.194196   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.194204   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:13.194211   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:13.194221   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:13.255821   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:13.255840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:13.267071   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:13.267087   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:13.336403   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:13.336424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:13.336434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:13.399839   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:13.399859   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:15.930208   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:15.940605   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:15.940671   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:15.966201   54581 cri.go:89] found id: ""
	I1201 19:34:15.966215   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.966223   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:15.966228   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:15.966291   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:15.996515   54581 cri.go:89] found id: ""
	I1201 19:34:15.996528   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.996535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:15.996541   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:15.996598   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:16.022535   54581 cri.go:89] found id: ""
	I1201 19:34:16.022550   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.022564   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:16.022569   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:16.022630   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:16.057222   54581 cri.go:89] found id: ""
	I1201 19:34:16.057236   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.057246   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:16.057252   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:16.057313   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:16.087879   54581 cri.go:89] found id: ""
	I1201 19:34:16.087893   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.087900   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:16.087905   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:16.087965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:16.120946   54581 cri.go:89] found id: ""
	I1201 19:34:16.120960   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.120968   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:16.120974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:16.121035   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:16.154523   54581 cri.go:89] found id: ""
	I1201 19:34:16.154538   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.154544   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:16.154552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:16.154562   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:16.227282   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:16.227292   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:16.227303   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:16.291304   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:16.291323   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:16.320283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:16.320299   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:16.379997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:16.380014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:18.891691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:18.901502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:18.901561   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:18.926115   54581 cri.go:89] found id: ""
	I1201 19:34:18.926128   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.926135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:18.926141   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:18.926212   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:18.951977   54581 cri.go:89] found id: ""
	I1201 19:34:18.951991   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.951998   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:18.952003   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:18.952068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:18.983248   54581 cri.go:89] found id: ""
	I1201 19:34:18.983266   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.983273   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:18.983278   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:18.983342   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:19.010990   54581 cri.go:89] found id: ""
	I1201 19:34:19.011010   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.011018   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:19.011024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:19.011086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:19.036672   54581 cri.go:89] found id: ""
	I1201 19:34:19.036686   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.036693   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:19.036699   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:19.036767   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:19.061847   54581 cri.go:89] found id: ""
	I1201 19:34:19.061861   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.061868   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:19.061873   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:19.061933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:19.095496   54581 cri.go:89] found id: ""
	I1201 19:34:19.095518   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.095525   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:19.095534   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:19.095544   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:19.160188   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:19.160209   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:19.171389   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:19.171411   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:19.237242   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:19.237253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:19.237273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:19.299987   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:19.300005   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:21.834525   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:21.845009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:21.845070   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:21.869831   54581 cri.go:89] found id: ""
	I1201 19:34:21.869848   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.869855   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:21.869863   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:21.869920   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:21.894806   54581 cri.go:89] found id: ""
	I1201 19:34:21.894819   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.894826   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:21.894831   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:21.894888   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:21.919467   54581 cri.go:89] found id: ""
	I1201 19:34:21.919481   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.919489   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:21.919494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:21.919557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:21.947371   54581 cri.go:89] found id: ""
	I1201 19:34:21.947384   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.947392   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:21.947397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:21.947466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:21.972455   54581 cri.go:89] found id: ""
	I1201 19:34:21.972469   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.972488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:21.972493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:21.972551   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:21.998955   54581 cri.go:89] found id: ""
	I1201 19:34:21.998969   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.998977   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:21.998982   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:21.999044   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:22.030320   54581 cri.go:89] found id: ""
	I1201 19:34:22.030348   54581 logs.go:282] 0 containers: []
	W1201 19:34:22.030356   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:22.030365   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:22.030378   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:22.091531   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:22.091549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:22.107258   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:22.107285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:22.185420   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:22.185431   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:22.185442   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:22.250849   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:22.250866   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:24.779249   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:24.792463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:24.792522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:24.817350   54581 cri.go:89] found id: ""
	I1201 19:34:24.817364   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.817371   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:24.817377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:24.817434   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:24.842191   54581 cri.go:89] found id: ""
	I1201 19:34:24.842205   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.842218   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:24.842224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:24.842284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:24.867478   54581 cri.go:89] found id: ""
	I1201 19:34:24.867492   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.867499   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:24.867505   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:24.867576   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:24.899422   54581 cri.go:89] found id: ""
	I1201 19:34:24.899436   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.899443   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:24.899452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:24.899509   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:24.934866   54581 cri.go:89] found id: ""
	I1201 19:34:24.934880   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.934887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:24.934893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:24.934956   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:24.959270   54581 cri.go:89] found id: ""
	I1201 19:34:24.959284   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.959291   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:24.959297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:24.959362   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:24.984211   54581 cri.go:89] found id: ""
	I1201 19:34:24.984224   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.984231   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:24.984239   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:24.984259   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:25.012471   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:25.012487   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:25.072643   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:25.072660   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:25.083552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:25.083571   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:25.160495   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:25.160504   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:25.160516   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:27.727176   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:27.737246   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:27.737307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:27.761343   54581 cri.go:89] found id: ""
	I1201 19:34:27.761357   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.761364   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:27.761370   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:27.761428   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:27.786257   54581 cri.go:89] found id: ""
	I1201 19:34:27.786276   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.786283   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:27.786288   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:27.786344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:27.810779   54581 cri.go:89] found id: ""
	I1201 19:34:27.810798   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.810807   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:27.810812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:27.810874   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:27.834773   54581 cri.go:89] found id: ""
	I1201 19:34:27.834792   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.834799   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:27.834804   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:27.834860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:27.862223   54581 cri.go:89] found id: ""
	I1201 19:34:27.862241   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.862248   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:27.862253   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:27.862307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:27.887279   54581 cri.go:89] found id: ""
	I1201 19:34:27.887292   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.887299   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:27.887305   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:27.887361   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:27.910821   54581 cri.go:89] found id: ""
	I1201 19:34:27.910834   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.910842   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:27.910849   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:27.910872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:27.920894   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:27.920909   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:27.982787   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:27.982797   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:27.982808   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:28.049448   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:28.049466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:28.083298   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:28.083315   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:30.648755   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:30.659054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:30.659115   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:30.683776   54581 cri.go:89] found id: ""
	I1201 19:34:30.683790   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.683797   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:30.683802   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:30.683858   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:30.708715   54581 cri.go:89] found id: ""
	I1201 19:34:30.708729   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.708736   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:30.708741   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:30.708801   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:30.732741   54581 cri.go:89] found id: ""
	I1201 19:34:30.732754   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.732761   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:30.732767   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:30.732821   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:30.762264   54581 cri.go:89] found id: ""
	I1201 19:34:30.762278   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.762284   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:30.762290   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:30.762353   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:30.789298   54581 cri.go:89] found id: ""
	I1201 19:34:30.789312   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.789319   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:30.789324   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:30.789381   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:30.814068   54581 cri.go:89] found id: ""
	I1201 19:34:30.814081   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.814089   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:30.814095   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:30.814157   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:30.841381   54581 cri.go:89] found id: ""
	I1201 19:34:30.841394   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.841402   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:30.841409   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:30.841431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:30.902920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:30.902931   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:30.902943   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:30.965009   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:30.965026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:30.993347   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:30.993370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:31.049258   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:31.049275   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.560996   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:33.571497   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:33.571557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:33.596875   54581 cri.go:89] found id: ""
	I1201 19:34:33.596889   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.596896   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:33.596901   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:33.596960   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:33.623640   54581 cri.go:89] found id: ""
	I1201 19:34:33.623653   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.623659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:33.623664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:33.623725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:33.647792   54581 cri.go:89] found id: ""
	I1201 19:34:33.647806   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.647814   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:33.647819   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:33.647882   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:33.672114   54581 cri.go:89] found id: ""
	I1201 19:34:33.672127   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.672134   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:33.672139   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:33.672197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:33.704799   54581 cri.go:89] found id: ""
	I1201 19:34:33.704812   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.704820   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:33.704825   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:33.704885   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:33.728981   54581 cri.go:89] found id: ""
	I1201 19:34:33.728995   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.729001   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:33.729006   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:33.729063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:33.756005   54581 cri.go:89] found id: ""
	I1201 19:34:33.756019   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.756027   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:33.756035   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:33.756046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:33.788420   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:33.788437   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:33.848036   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:33.848054   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.858909   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:33.858925   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:33.921156   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:33.921167   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:33.921178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.484434   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:36.494616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:36.494679   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:36.520017   54581 cri.go:89] found id: ""
	I1201 19:34:36.520031   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.520038   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:36.520044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:36.520100   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:36.545876   54581 cri.go:89] found id: ""
	I1201 19:34:36.545890   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.545897   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:36.545903   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:36.545966   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:36.571571   54581 cri.go:89] found id: ""
	I1201 19:34:36.571584   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.571591   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:36.571596   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:36.571653   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:36.596997   54581 cri.go:89] found id: ""
	I1201 19:34:36.597012   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.597019   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:36.597024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:36.597101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:36.623469   54581 cri.go:89] found id: ""
	I1201 19:34:36.623483   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.623491   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:36.623496   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:36.623556   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:36.651811   54581 cri.go:89] found id: ""
	I1201 19:34:36.651824   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.651831   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:36.651837   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:36.651893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:36.676659   54581 cri.go:89] found id: ""
	I1201 19:34:36.676673   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.676680   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:36.676688   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:36.676697   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:36.732392   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:36.732410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:36.743384   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:36.743400   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:36.805329   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:36.805338   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:36.805349   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.867566   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:36.867584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:39.402157   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:39.412161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:39.412220   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:39.439366   54581 cri.go:89] found id: ""
	I1201 19:34:39.439380   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.439387   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:39.439392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:39.439451   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:39.464076   54581 cri.go:89] found id: ""
	I1201 19:34:39.464090   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.464097   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:39.464108   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:39.464171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:39.488248   54581 cri.go:89] found id: ""
	I1201 19:34:39.488262   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.488270   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:39.488275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:39.488331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:39.517302   54581 cri.go:89] found id: ""
	I1201 19:34:39.517315   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.517322   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:39.517328   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:39.517385   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:39.542966   54581 cri.go:89] found id: ""
	I1201 19:34:39.542980   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.542986   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:39.542992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:39.543051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:39.568903   54581 cri.go:89] found id: ""
	I1201 19:34:39.568917   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.568924   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:39.568929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:39.568990   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:39.594057   54581 cri.go:89] found id: ""
	I1201 19:34:39.594069   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.594076   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:39.594084   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:39.594093   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:39.649679   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:39.649698   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:39.660114   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:39.660133   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:39.725472   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:39.725500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:39.725512   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:39.793738   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:39.793756   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:42.322742   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:42.333451   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:42.333536   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:42.368118   54581 cri.go:89] found id: ""
	I1201 19:34:42.368132   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.368139   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:42.368146   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:42.368217   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:42.402172   54581 cri.go:89] found id: ""
	I1201 19:34:42.402186   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.402193   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:42.402198   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:42.402266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:42.426759   54581 cri.go:89] found id: ""
	I1201 19:34:42.426772   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.426780   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:42.426785   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:42.426842   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:42.466084   54581 cri.go:89] found id: ""
	I1201 19:34:42.466097   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.466105   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:42.466110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:42.466168   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:42.490814   54581 cri.go:89] found id: ""
	I1201 19:34:42.490828   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.490835   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:42.490841   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:42.490899   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:42.516557   54581 cri.go:89] found id: ""
	I1201 19:34:42.516570   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.516578   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:42.516583   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:42.516651   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:42.542203   54581 cri.go:89] found id: ""
	I1201 19:34:42.542218   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.542224   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:42.542233   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:42.542243   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:42.599254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:42.599272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:42.610313   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:42.610328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:42.677502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:42.677514   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:42.677527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:42.751656   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:42.751683   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.281764   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:45.295929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:45.296004   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:45.355982   54581 cri.go:89] found id: ""
	I1201 19:34:45.356019   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.356027   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:45.356043   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:45.356214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:45.395974   54581 cri.go:89] found id: ""
	I1201 19:34:45.395987   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.396003   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:45.396008   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:45.396064   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:45.425011   54581 cri.go:89] found id: ""
	I1201 19:34:45.425027   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.425035   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:45.425041   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:45.425175   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:45.450304   54581 cri.go:89] found id: ""
	I1201 19:34:45.450317   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.450325   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:45.450330   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:45.450399   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:45.480282   54581 cri.go:89] found id: ""
	I1201 19:34:45.480296   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.480302   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:45.480307   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:45.480376   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:45.511012   54581 cri.go:89] found id: ""
	I1201 19:34:45.511026   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.511033   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:45.511039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:45.511101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:45.536767   54581 cri.go:89] found id: ""
	I1201 19:34:45.536781   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.536797   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:45.536806   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:45.536818   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:45.547801   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:45.547822   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:45.615408   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:45.615424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:45.615434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:45.679022   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:45.679041   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.711030   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:45.711049   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.268349   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:48.279339   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:48.279398   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:48.304817   54581 cri.go:89] found id: ""
	I1201 19:34:48.304831   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.304839   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:48.304844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:48.304905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:48.329897   54581 cri.go:89] found id: ""
	I1201 19:34:48.329911   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.329919   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:48.329924   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:48.329982   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:48.369087   54581 cri.go:89] found id: ""
	I1201 19:34:48.369100   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.369107   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:48.369112   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:48.369169   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:48.400882   54581 cri.go:89] found id: ""
	I1201 19:34:48.400896   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.400903   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:48.400909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:48.400965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:48.426896   54581 cri.go:89] found id: ""
	I1201 19:34:48.426912   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.426920   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:48.426925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:48.426987   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:48.455956   54581 cri.go:89] found id: ""
	I1201 19:34:48.455969   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.455987   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:48.455994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:48.456051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:48.480640   54581 cri.go:89] found id: ""
	I1201 19:34:48.480653   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.480671   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:48.480679   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:48.480690   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.536591   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:48.536609   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:48.547466   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:48.547482   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:48.620325   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:48.620335   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:48.620345   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:48.683388   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:48.683407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:51.214144   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:51.224292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:51.224364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:51.247923   54581 cri.go:89] found id: ""
	I1201 19:34:51.247937   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.247945   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:51.247952   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:51.248011   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:51.273984   54581 cri.go:89] found id: ""
	I1201 19:34:51.273998   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.274005   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:51.274011   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:51.274072   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:51.298775   54581 cri.go:89] found id: ""
	I1201 19:34:51.298789   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.298796   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:51.298801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:51.298860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:51.326553   54581 cri.go:89] found id: ""
	I1201 19:34:51.326567   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.326574   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:51.326580   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:51.326639   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:51.360945   54581 cri.go:89] found id: ""
	I1201 19:34:51.360959   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.360987   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:51.360992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:51.361059   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:51.396255   54581 cri.go:89] found id: ""
	I1201 19:34:51.396282   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.396290   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:51.396296   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:51.396369   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:51.427687   54581 cri.go:89] found id: ""
	I1201 19:34:51.427700   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.427707   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:51.427715   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:51.427734   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:51.483915   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:51.483934   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:51.495247   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:51.495271   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:51.559547   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:51.559558   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:51.559568   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:51.623141   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:51.623161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:54.157001   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:54.170439   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:54.170498   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:54.203772   54581 cri.go:89] found id: ""
	I1201 19:34:54.203785   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.203792   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:54.203798   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:54.203854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:54.231733   54581 cri.go:89] found id: ""
	I1201 19:34:54.231747   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.231754   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:54.231759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:54.231817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:54.256716   54581 cri.go:89] found id: ""
	I1201 19:34:54.256739   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.256746   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:54.256752   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:54.256817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:54.281376   54581 cri.go:89] found id: ""
	I1201 19:34:54.281390   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.281407   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:54.281413   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:54.281469   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:54.305969   54581 cri.go:89] found id: ""
	I1201 19:34:54.305982   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.305989   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:54.305994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:54.306049   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:54.330385   54581 cri.go:89] found id: ""
	I1201 19:34:54.330399   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.330406   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:54.330422   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:54.330478   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:54.358455   54581 cri.go:89] found id: ""
	I1201 19:34:54.358478   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.358489   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:54.358497   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:54.358508   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:54.422783   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:54.422804   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:54.434139   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:54.434153   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:54.499665   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:54.499677   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:54.499689   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:54.562594   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:54.562614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:57.093944   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:57.104140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:57.104207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:57.129578   54581 cri.go:89] found id: ""
	I1201 19:34:57.129590   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.129597   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:57.129603   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:57.129663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:57.153119   54581 cri.go:89] found id: ""
	I1201 19:34:57.153133   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.153140   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:57.153145   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:57.153202   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:57.178134   54581 cri.go:89] found id: ""
	I1201 19:34:57.178148   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.178155   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:57.178161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:57.178222   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:57.208559   54581 cri.go:89] found id: ""
	I1201 19:34:57.208572   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.208579   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:57.208585   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:57.208642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:57.232807   54581 cri.go:89] found id: ""
	I1201 19:34:57.232821   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.232838   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:57.232844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:57.232898   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:57.257939   54581 cri.go:89] found id: ""
	I1201 19:34:57.257952   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.257959   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:57.257964   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:57.258022   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:57.283855   54581 cri.go:89] found id: ""
	I1201 19:34:57.283869   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.283875   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:57.283883   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:57.283893   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:57.340764   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:57.340781   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:57.352935   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:57.352949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:57.427562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:57.427571   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:57.427581   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:57.490526   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:57.490553   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.020694   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:00.036199   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:00.036266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:00.146207   54581 cri.go:89] found id: ""
	I1201 19:35:00.146226   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.146234   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:00.146241   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:00.146319   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:00.271439   54581 cri.go:89] found id: ""
	I1201 19:35:00.271454   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.271462   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:00.271468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:00.271541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:00.365096   54581 cri.go:89] found id: ""
	I1201 19:35:00.365111   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.365119   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:00.365124   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:00.365190   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:00.419095   54581 cri.go:89] found id: ""
	I1201 19:35:00.419109   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.419116   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:00.419123   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:00.419184   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:00.457455   54581 cri.go:89] found id: ""
	I1201 19:35:00.457470   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.457478   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:00.457507   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:00.457577   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:00.503679   54581 cri.go:89] found id: ""
	I1201 19:35:00.503694   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.503701   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:00.503710   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:00.503803   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:00.546120   54581 cri.go:89] found id: ""
	I1201 19:35:00.546135   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.546142   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:00.546151   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:00.546164   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:00.559836   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:00.559853   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:00.634650   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:00.634660   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:00.634675   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:00.700259   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:00.700278   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.733345   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:00.733363   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.295407   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:03.305664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:03.305725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:03.330370   54581 cri.go:89] found id: ""
	I1201 19:35:03.330385   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.330392   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:03.330397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:03.330452   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:03.356109   54581 cri.go:89] found id: ""
	I1201 19:35:03.356123   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.356130   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:03.356135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:03.356198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:03.382338   54581 cri.go:89] found id: ""
	I1201 19:35:03.382352   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.382360   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:03.382366   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:03.382423   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:03.414550   54581 cri.go:89] found id: ""
	I1201 19:35:03.414564   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.414571   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:03.414577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:03.414633   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:03.438540   54581 cri.go:89] found id: ""
	I1201 19:35:03.438553   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.438560   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:03.438565   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:03.438623   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:03.463113   54581 cri.go:89] found id: ""
	I1201 19:35:03.463127   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.463134   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:03.463140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:03.463204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:03.487632   54581 cri.go:89] found id: ""
	I1201 19:35:03.487645   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.487653   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:03.487660   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:03.487670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.544515   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:03.544536   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:03.555787   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:03.555803   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:03.627256   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:03.627266   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:03.627276   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:03.691235   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:03.691254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.220125   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:06.230749   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:06.230813   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:06.255951   54581 cri.go:89] found id: ""
	I1201 19:35:06.255965   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.255972   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:06.255977   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:06.256034   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:06.281528   54581 cri.go:89] found id: ""
	I1201 19:35:06.281542   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.281549   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:06.281554   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:06.281613   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:06.306502   54581 cri.go:89] found id: ""
	I1201 19:35:06.306515   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.306522   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:06.306527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:06.306590   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:06.337726   54581 cri.go:89] found id: ""
	I1201 19:35:06.337739   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.337745   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:06.337751   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:06.337810   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:06.367682   54581 cri.go:89] found id: ""
	I1201 19:35:06.367696   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.367713   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:06.367726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:06.367793   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:06.397675   54581 cri.go:89] found id: ""
	I1201 19:35:06.397690   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.397707   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:06.397713   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:06.397778   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:06.424426   54581 cri.go:89] found id: ""
	I1201 19:35:06.424439   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.424452   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:06.424460   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:06.424471   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:06.435325   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:06.435340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:06.499920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:06.492188   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.492789   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494445   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494930   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.496500   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:06.492188   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.492789   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494445   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494930   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.496500   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:06.499942   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:06.499952   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:06.564348   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:06.564367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.592906   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:06.592921   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:09.151061   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:09.161179   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:09.161240   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:09.186739   54581 cri.go:89] found id: ""
	I1201 19:35:09.186752   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.186759   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:09.186765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:09.186822   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:09.211245   54581 cri.go:89] found id: ""
	I1201 19:35:09.211259   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.211267   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:09.211273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:09.211336   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:09.239043   54581 cri.go:89] found id: ""
	I1201 19:35:09.239056   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.239063   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:09.239068   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:09.239125   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:09.264055   54581 cri.go:89] found id: ""
	I1201 19:35:09.264068   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.264076   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:09.264081   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:09.264137   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:09.288509   54581 cri.go:89] found id: ""
	I1201 19:35:09.288522   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.288529   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:09.288536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:09.288593   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:09.312763   54581 cri.go:89] found id: ""
	I1201 19:35:09.312777   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.312784   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:09.312789   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:09.312851   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:09.344164   54581 cri.go:89] found id: ""
	I1201 19:35:09.344177   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.344184   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:09.344192   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:09.344203   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:09.356120   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:09.356134   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:09.428320   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:09.420284   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.420923   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.422622   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.423259   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.424866   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:09.420284   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.420923   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.422622   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.423259   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.424866   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:09.428329   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:09.428339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:09.491282   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:09.491301   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:09.518473   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:09.518488   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.081815   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:12.092336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:12.092400   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:12.117269   54581 cri.go:89] found id: ""
	I1201 19:35:12.117284   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.117291   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:12.117297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:12.117355   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:12.141885   54581 cri.go:89] found id: ""
	I1201 19:35:12.141898   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.141904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:12.141909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:12.141968   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:12.166386   54581 cri.go:89] found id: ""
	I1201 19:35:12.166400   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.166407   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:12.166411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:12.166479   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:12.190615   54581 cri.go:89] found id: ""
	I1201 19:35:12.190628   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.190636   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:12.190641   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:12.190701   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:12.219887   54581 cri.go:89] found id: ""
	I1201 19:35:12.219900   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.219907   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:12.219912   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:12.219970   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:12.244718   54581 cri.go:89] found id: ""
	I1201 19:35:12.244731   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.244738   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:12.244743   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:12.244802   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:12.272273   54581 cri.go:89] found id: ""
	I1201 19:35:12.272287   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.272294   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:12.272301   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:12.272312   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.329315   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:12.329334   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:12.343015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:12.343032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:12.419939   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:12.411319   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.412222   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414132   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414468   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.415971   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:12.411319   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.412222   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414132   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414468   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.415971   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:12.419949   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:12.419960   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:12.482187   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:12.482205   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:15.011802   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:15.022432   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:15.022499   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:15.094886   54581 cri.go:89] found id: ""
	I1201 19:35:15.094901   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.094909   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:15.094915   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:15.094978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:15.120839   54581 cri.go:89] found id: ""
	I1201 19:35:15.120853   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.120860   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:15.120865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:15.120927   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:15.150767   54581 cri.go:89] found id: ""
	I1201 19:35:15.150781   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.150795   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:15.150801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:15.150867   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:15.177630   54581 cri.go:89] found id: ""
	I1201 19:35:15.177644   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.177651   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:15.177656   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:15.177727   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:15.203467   54581 cri.go:89] found id: ""
	I1201 19:35:15.203480   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.203498   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:15.203504   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:15.203563   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:15.229010   54581 cri.go:89] found id: ""
	I1201 19:35:15.229023   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.229031   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:15.229036   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:15.229128   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:15.254029   54581 cri.go:89] found id: ""
	I1201 19:35:15.254043   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.254051   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:15.254058   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:15.254068   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:15.309931   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:15.309949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:15.320452   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:15.320466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:15.413158   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:15.405233   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.405928   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.407533   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.408047   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.409794   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:15.405233   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.405928   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.407533   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.408047   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.409794   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:15.413169   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:15.413180   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:15.475409   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:15.475428   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:18.004450   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:18.015126   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:18.015185   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:18.046344   54581 cri.go:89] found id: ""
	I1201 19:35:18.046359   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.046366   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:18.046373   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:18.046436   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:18.074519   54581 cri.go:89] found id: ""
	I1201 19:35:18.074532   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.074539   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:18.074545   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:18.074603   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:18.103787   54581 cri.go:89] found id: ""
	I1201 19:35:18.103801   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.103808   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:18.103814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:18.103869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:18.130363   54581 cri.go:89] found id: ""
	I1201 19:35:18.130377   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.130384   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:18.130390   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:18.130449   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:18.155589   54581 cri.go:89] found id: ""
	I1201 19:35:18.155616   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.155625   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:18.155630   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:18.155699   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:18.180628   54581 cri.go:89] found id: ""
	I1201 19:35:18.180641   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.180648   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:18.180654   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:18.180711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:18.205996   54581 cri.go:89] found id: ""
	I1201 19:35:18.206026   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.206033   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:18.206041   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:18.206051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:18.260718   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:18.260736   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:18.271842   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:18.271858   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:18.342769   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:18.332100   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.332989   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.334523   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.336007   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.337221   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:18.332100   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.332989   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.334523   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.336007   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.337221   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:18.342780   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:18.342793   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:18.423726   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:18.423744   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:20.954199   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:20.964087   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:20.964143   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:20.987490   54581 cri.go:89] found id: ""
	I1201 19:35:20.987504   54581 logs.go:282] 0 containers: []
	W1201 19:35:20.987510   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:20.987516   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:20.987572   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:21.012114   54581 cri.go:89] found id: ""
	I1201 19:35:21.012128   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.012135   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:21.012140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:21.012201   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:21.037730   54581 cri.go:89] found id: ""
	I1201 19:35:21.037744   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.037751   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:21.037756   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:21.037815   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:21.062445   54581 cri.go:89] found id: ""
	I1201 19:35:21.062458   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.062465   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:21.062471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:21.062529   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:21.086847   54581 cri.go:89] found id: ""
	I1201 19:35:21.086860   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.086867   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:21.086872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:21.086930   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:21.111866   54581 cri.go:89] found id: ""
	I1201 19:35:21.111880   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.111886   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:21.111892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:21.111948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:21.136296   54581 cri.go:89] found id: ""
	I1201 19:35:21.136311   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.136318   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:21.136326   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:21.136343   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:21.200999   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:21.201009   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:21.201020   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:21.265838   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:21.265857   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:21.296214   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:21.296230   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:21.354254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:21.354272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:23.868647   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:23.879143   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:23.879205   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:23.907613   54581 cri.go:89] found id: ""
	I1201 19:35:23.907633   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.907640   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:23.907645   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:23.907705   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:23.932767   54581 cri.go:89] found id: ""
	I1201 19:35:23.932781   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.932787   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:23.932793   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:23.932849   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:23.961305   54581 cri.go:89] found id: ""
	I1201 19:35:23.961319   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.961326   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:23.961331   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:23.961387   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:23.986651   54581 cri.go:89] found id: ""
	I1201 19:35:23.986664   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.986670   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:23.986676   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:23.986734   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:24.011204   54581 cri.go:89] found id: ""
	I1201 19:35:24.011218   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.011225   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:24.011230   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:24.011286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:24.040784   54581 cri.go:89] found id: ""
	I1201 19:35:24.040798   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.040806   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:24.040812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:24.040871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:24.067432   54581 cri.go:89] found id: ""
	I1201 19:35:24.067446   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.067453   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:24.067461   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:24.067472   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:24.132929   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:24.132946   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:24.132956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:24.194894   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:24.194912   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:24.225351   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:24.225366   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:24.282142   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:24.282161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:26.793143   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:26.803454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:26.803518   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:26.827433   54581 cri.go:89] found id: ""
	I1201 19:35:26.827447   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.827454   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:26.827459   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:26.827514   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:26.851666   54581 cri.go:89] found id: ""
	I1201 19:35:26.851680   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.851686   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:26.851691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:26.851749   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:26.880353   54581 cri.go:89] found id: ""
	I1201 19:35:26.880367   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.880374   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:26.880379   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:26.880437   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:26.908944   54581 cri.go:89] found id: ""
	I1201 19:35:26.908957   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.908964   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:26.908969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:26.909025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:26.933983   54581 cri.go:89] found id: ""
	I1201 19:35:26.933996   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.934003   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:26.934009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:26.934069   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:26.958791   54581 cri.go:89] found id: ""
	I1201 19:35:26.958805   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.958812   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:26.958818   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:26.958878   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:26.983156   54581 cri.go:89] found id: ""
	I1201 19:35:26.983170   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.983177   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:26.983185   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:26.983200   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:27.038997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:27.039015   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:27.050299   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:27.050314   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:27.113733   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:27.113744   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:27.113754   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:27.176267   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:27.176285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.706128   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:29.716285   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:29.716344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:29.741420   54581 cri.go:89] found id: ""
	I1201 19:35:29.741435   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.741442   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:29.741447   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:29.741545   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:29.766524   54581 cri.go:89] found id: ""
	I1201 19:35:29.766538   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.766545   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:29.766550   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:29.766616   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:29.795421   54581 cri.go:89] found id: ""
	I1201 19:35:29.795434   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.795441   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:29.795446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:29.795511   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:29.821121   54581 cri.go:89] found id: ""
	I1201 19:35:29.821135   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.821142   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:29.821147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:29.821204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:29.849641   54581 cri.go:89] found id: ""
	I1201 19:35:29.849654   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.849662   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:29.849667   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:29.849724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:29.874049   54581 cri.go:89] found id: ""
	I1201 19:35:29.874063   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.874069   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:29.874075   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:29.874136   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:29.897867   54581 cri.go:89] found id: ""
	I1201 19:35:29.897880   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.897887   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:29.897895   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:29.897905   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:29.959029   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:29.959046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.991283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:29.991298   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:30.051265   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:30.051286   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:30.082322   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:30.082339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:30.173300   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:32.673672   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:32.683965   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:32.684023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:32.712191   54581 cri.go:89] found id: ""
	I1201 19:35:32.712204   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.712211   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:32.712216   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:32.712275   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:32.739246   54581 cri.go:89] found id: ""
	I1201 19:35:32.739259   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.739266   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:32.739272   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:32.739331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:32.763898   54581 cri.go:89] found id: ""
	I1201 19:35:32.763911   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.763924   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:32.763929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:32.763989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:32.789967   54581 cri.go:89] found id: ""
	I1201 19:35:32.789990   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.789997   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:32.790004   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:32.790063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:32.816013   54581 cri.go:89] found id: ""
	I1201 19:35:32.816028   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.816035   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:32.816040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:32.816098   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:32.839560   54581 cri.go:89] found id: ""
	I1201 19:35:32.839573   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.839580   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:32.839586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:32.839644   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:32.868062   54581 cri.go:89] found id: ""
	I1201 19:35:32.868075   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.868082   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:32.868090   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:32.868099   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:32.923266   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:32.923285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:32.934015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:32.934030   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:33.005502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:33.005512   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:33.005523   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:33.075965   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:33.075984   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.605628   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:35.617054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:35.617126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:35.646998   54581 cri.go:89] found id: ""
	I1201 19:35:35.647012   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.647019   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:35.647025   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:35.647086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:35.676130   54581 cri.go:89] found id: ""
	I1201 19:35:35.676143   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.676150   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:35.676155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:35.676211   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:35.700589   54581 cri.go:89] found id: ""
	I1201 19:35:35.700602   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.700609   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:35.700616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:35.700672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:35.725233   54581 cri.go:89] found id: ""
	I1201 19:35:35.725246   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.725253   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:35.725273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:35.725343   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:35.750382   54581 cri.go:89] found id: ""
	I1201 19:35:35.750396   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.750403   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:35.750408   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:35.750462   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:35.775219   54581 cri.go:89] found id: ""
	I1201 19:35:35.775235   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.775243   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:35.775248   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:35.775320   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:35.800831   54581 cri.go:89] found id: ""
	I1201 19:35:35.800845   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.800852   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:35.800859   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:35.800870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:35.866740   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:35.858616   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.859347   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861068   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861726   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.863343   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:35.858616   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.859347   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861068   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861726   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.863343   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:35.866756   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:35.866767   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:35.931013   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:35.931031   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.958721   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:35.958743   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:36.015847   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:36.015863   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.535518   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:38.545931   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:38.545993   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:38.571083   54581 cri.go:89] found id: ""
	I1201 19:35:38.571097   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.571104   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:38.571109   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:38.571170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:38.608738   54581 cri.go:89] found id: ""
	I1201 19:35:38.608752   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.608759   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:38.608765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:38.608820   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:38.635605   54581 cri.go:89] found id: ""
	I1201 19:35:38.635619   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.635626   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:38.635631   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:38.635689   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:38.668134   54581 cri.go:89] found id: ""
	I1201 19:35:38.668147   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.668155   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:38.668172   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:38.668231   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:38.693505   54581 cri.go:89] found id: ""
	I1201 19:35:38.693519   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.693526   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:38.693531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:38.693602   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:38.719017   54581 cri.go:89] found id: ""
	I1201 19:35:38.719031   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.719039   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:38.719044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:38.719103   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:38.748727   54581 cri.go:89] found id: ""
	I1201 19:35:38.748740   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.748747   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:38.748754   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:38.748765   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:38.778021   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:38.778037   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:38.838504   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:38.838524   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.851587   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:38.851603   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:38.919080   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:38.909975   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.911254   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.912341   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.913320   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.914352   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:38.909975   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.911254   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.912341   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.913320   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.914352   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:38.919115   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:38.919130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.484602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:41.495239   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:41.495298   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:41.525151   54581 cri.go:89] found id: ""
	I1201 19:35:41.525165   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.525172   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:41.525191   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:41.525256   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:41.551287   54581 cri.go:89] found id: ""
	I1201 19:35:41.551301   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.551309   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:41.551329   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:41.551392   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:41.577108   54581 cri.go:89] found id: ""
	I1201 19:35:41.577124   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.577131   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:41.577136   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:41.577204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:41.613970   54581 cri.go:89] found id: ""
	I1201 19:35:41.613983   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.613991   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:41.614005   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:41.614063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:41.647948   54581 cri.go:89] found id: ""
	I1201 19:35:41.647961   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.647968   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:41.647973   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:41.648038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:41.675741   54581 cri.go:89] found id: ""
	I1201 19:35:41.675754   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.675761   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:41.675770   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:41.675827   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:41.701031   54581 cri.go:89] found id: ""
	I1201 19:35:41.701053   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.701061   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:41.701068   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:41.701079   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:41.712066   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:41.712081   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:41.774820   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:41.767074   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.767651   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769208   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769794   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.771321   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:41.767074   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.767651   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769208   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769794   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.771321   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:41.774852   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:41.774864   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.837237   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:41.837254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:41.867407   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:41.867423   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.425417   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:44.436694   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:44.436764   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:44.462550   54581 cri.go:89] found id: ""
	I1201 19:35:44.462565   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.462571   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:44.462577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:44.462634   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:44.490237   54581 cri.go:89] found id: ""
	I1201 19:35:44.490250   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.490257   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:44.490262   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:44.490318   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:44.517417   54581 cri.go:89] found id: ""
	I1201 19:35:44.517431   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.517438   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:44.517443   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:44.517523   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:44.542502   54581 cri.go:89] found id: ""
	I1201 19:35:44.542516   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.542523   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:44.542528   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:44.542588   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:44.568636   54581 cri.go:89] found id: ""
	I1201 19:35:44.568650   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.568682   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:44.568688   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:44.568756   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:44.602872   54581 cri.go:89] found id: ""
	I1201 19:35:44.602891   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.602898   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:44.602904   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:44.602961   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:44.633265   54581 cri.go:89] found id: ""
	I1201 19:35:44.633280   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.633287   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:44.633295   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:44.633305   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:44.704029   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:44.695965   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.696791   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698434   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698915   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.700082   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:44.695965   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.696791   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698434   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698915   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.700082   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:44.704040   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:44.704051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:44.768055   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:44.768075   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:44.797083   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:44.797098   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.852537   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:44.852555   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.364630   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:47.374921   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:47.374978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:47.399587   54581 cri.go:89] found id: ""
	I1201 19:35:47.399600   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.399607   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:47.399613   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:47.399672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:47.426120   54581 cri.go:89] found id: ""
	I1201 19:35:47.426134   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.426141   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:47.426147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:47.426227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:47.457662   54581 cri.go:89] found id: ""
	I1201 19:35:47.457676   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.457683   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:47.457689   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:47.457747   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:47.482682   54581 cri.go:89] found id: ""
	I1201 19:35:47.482702   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.482709   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:47.482728   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:47.482796   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:47.511319   54581 cri.go:89] found id: ""
	I1201 19:35:47.511334   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.511341   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:47.511346   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:47.511409   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:47.543730   54581 cri.go:89] found id: ""
	I1201 19:35:47.543742   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.543760   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:47.543765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:47.543831   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:47.572333   54581 cri.go:89] found id: ""
	I1201 19:35:47.572347   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.572355   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:47.572363   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:47.572385   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:47.637165   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:47.637184   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.648940   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:47.648956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:47.711651   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:47.704333   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.704738   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706241   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706574   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.708054   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:47.704333   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.704738   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706241   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706574   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.708054   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:47.711662   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:47.711681   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:47.773144   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:47.773163   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:50.303086   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:50.313234   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:50.313293   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:50.337483   54581 cri.go:89] found id: ""
	I1201 19:35:50.337515   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.337522   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:50.337527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:50.337583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:50.363911   54581 cri.go:89] found id: ""
	I1201 19:35:50.363927   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.363934   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:50.363939   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:50.363994   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:50.388359   54581 cri.go:89] found id: ""
	I1201 19:35:50.388373   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.388380   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:50.388386   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:50.388441   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:50.412983   54581 cri.go:89] found id: ""
	I1201 19:35:50.412996   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.413003   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:50.413014   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:50.413073   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:50.440996   54581 cri.go:89] found id: ""
	I1201 19:35:50.441017   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.441024   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:50.441030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:50.441085   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:50.467480   54581 cri.go:89] found id: ""
	I1201 19:35:50.467493   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.467501   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:50.467506   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:50.467567   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:50.494388   54581 cri.go:89] found id: ""
	I1201 19:35:50.494402   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.494409   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:50.494416   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:50.494427   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:50.550339   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:50.550359   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:50.561242   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:50.561258   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:50.633849   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:50.625518   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.626220   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.627078   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628200   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628973   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:50.625518   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.626220   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.627078   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628200   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628973   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:50.633860   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:50.633870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:50.702260   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:50.702280   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:53.234959   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:53.245018   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:53.245083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:53.276399   54581 cri.go:89] found id: ""
	I1201 19:35:53.276413   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.276420   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:53.276425   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:53.276491   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:53.305853   54581 cri.go:89] found id: ""
	I1201 19:35:53.305866   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.305873   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:53.305878   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:53.305935   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:53.335241   54581 cri.go:89] found id: ""
	I1201 19:35:53.335255   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.335263   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:53.335269   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:53.335328   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:53.359467   54581 cri.go:89] found id: ""
	I1201 19:35:53.359481   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.359488   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:53.359493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:53.359550   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:53.384120   54581 cri.go:89] found id: ""
	I1201 19:35:53.384134   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.384141   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:53.384147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:53.384203   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:53.414128   54581 cri.go:89] found id: ""
	I1201 19:35:53.414141   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.414149   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:53.414155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:53.414214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:53.439408   54581 cri.go:89] found id: ""
	I1201 19:35:53.439421   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.439428   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:53.439436   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:53.439446   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:53.495007   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:53.495026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:53.505932   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:53.505948   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:53.572678   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:53.572688   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:53.572702   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:53.650600   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:53.650621   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:56.183319   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:56.193782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:56.193843   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:56.224114   54581 cri.go:89] found id: ""
	I1201 19:35:56.224128   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.224135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:56.224140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:56.224197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:56.254013   54581 cri.go:89] found id: ""
	I1201 19:35:56.254027   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.254034   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:56.254040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:56.254102   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:56.279886   54581 cri.go:89] found id: ""
	I1201 19:35:56.279900   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.279908   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:56.279914   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:56.279976   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:56.304943   54581 cri.go:89] found id: ""
	I1201 19:35:56.304956   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.304963   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:56.304969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:56.305025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:56.328633   54581 cri.go:89] found id: ""
	I1201 19:35:56.328647   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.328654   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:56.328659   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:56.328715   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:56.357255   54581 cri.go:89] found id: ""
	I1201 19:35:56.357269   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.357276   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:56.357281   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:56.357340   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:56.381420   54581 cri.go:89] found id: ""
	I1201 19:35:56.381434   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.381441   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:56.381449   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:56.381459   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:56.439709   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:56.439728   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:56.450590   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:56.450605   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:56.516412   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:56.516423   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:56.516435   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:56.577800   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:56.577828   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.114477   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:59.124117   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:59.124179   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:59.151351   54581 cri.go:89] found id: ""
	I1201 19:35:59.151364   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.151372   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:59.151377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:59.151433   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:59.179997   54581 cri.go:89] found id: ""
	I1201 19:35:59.180010   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.180017   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:59.180022   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:59.180084   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:59.204818   54581 cri.go:89] found id: ""
	I1201 19:35:59.204832   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.204859   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:59.204864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:59.204923   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:59.230443   54581 cri.go:89] found id: ""
	I1201 19:35:59.230456   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.230464   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:59.230470   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:59.230524   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:59.254548   54581 cri.go:89] found id: ""
	I1201 19:35:59.254561   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.254569   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:59.254574   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:59.254629   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:59.282564   54581 cri.go:89] found id: ""
	I1201 19:35:59.282577   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.282584   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:59.282590   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:59.282645   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:59.310544   54581 cri.go:89] found id: ""
	I1201 19:35:59.310557   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.310565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:59.310573   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:59.310587   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:59.377012   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:59.377021   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:59.377032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:59.441479   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:59.441511   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.471908   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:59.471924   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:59.527613   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:59.527631   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.040294   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:02.051787   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:02.051869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:02.077788   54581 cri.go:89] found id: ""
	I1201 19:36:02.077801   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.077808   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:02.077814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:02.077871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:02.103346   54581 cri.go:89] found id: ""
	I1201 19:36:02.103359   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.103366   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:02.103371   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:02.103427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:02.128949   54581 cri.go:89] found id: ""
	I1201 19:36:02.128963   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.128970   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:02.128975   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:02.129033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:02.153585   54581 cri.go:89] found id: ""
	I1201 19:36:02.153598   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.153605   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:02.153611   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:02.153668   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:02.180499   54581 cri.go:89] found id: ""
	I1201 19:36:02.180513   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.180520   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:02.180531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:02.180592   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:02.206116   54581 cri.go:89] found id: ""
	I1201 19:36:02.206131   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.206138   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:02.206144   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:02.206210   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:02.232470   54581 cri.go:89] found id: ""
	I1201 19:36:02.232484   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.232492   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:02.232500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:02.232513   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:02.295347   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:02.295367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:02.323002   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:02.323018   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:02.382028   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:02.382046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.393159   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:02.393176   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:02.457522   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:04.957729   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:04.967951   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:04.968012   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:04.993754   54581 cri.go:89] found id: ""
	I1201 19:36:04.993769   54581 logs.go:282] 0 containers: []
	W1201 19:36:04.993776   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:04.993782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:04.993844   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:05.019859   54581 cri.go:89] found id: ""
	I1201 19:36:05.019873   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.019881   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:05.019886   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:05.019943   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:05.047016   54581 cri.go:89] found id: ""
	I1201 19:36:05.047031   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.047038   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:05.047046   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:05.047107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:05.072292   54581 cri.go:89] found id: ""
	I1201 19:36:05.072306   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.072313   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:05.072318   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:05.072377   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:05.099842   54581 cri.go:89] found id: ""
	I1201 19:36:05.099857   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.099864   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:05.099870   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:05.099926   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:05.125552   54581 cri.go:89] found id: ""
	I1201 19:36:05.125566   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.125573   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:05.125579   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:05.125635   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:05.150637   54581 cri.go:89] found id: ""
	I1201 19:36:05.150651   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.150659   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:05.150667   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:05.150677   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:05.218391   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:05.218410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:05.246651   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:05.246670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:05.303677   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:05.303694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:05.314794   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:05.314809   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:05.380077   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:07.881622   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:07.893048   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:07.893109   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:07.918109   54581 cri.go:89] found id: ""
	I1201 19:36:07.918122   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.918129   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:07.918134   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:07.918196   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:07.943504   54581 cri.go:89] found id: ""
	I1201 19:36:07.943518   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.943525   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:07.943536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:07.943595   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:07.969943   54581 cri.go:89] found id: ""
	I1201 19:36:07.969958   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.969965   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:07.969971   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:07.970033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:07.994994   54581 cri.go:89] found id: ""
	I1201 19:36:07.995009   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.995015   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:07.995021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:07.995083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:08.020591   54581 cri.go:89] found id: ""
	I1201 19:36:08.020605   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.020612   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:08.020617   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:08.020676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:08.053041   54581 cri.go:89] found id: ""
	I1201 19:36:08.053056   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.053063   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:08.053069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:08.053129   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:08.084333   54581 cri.go:89] found id: ""
	I1201 19:36:08.084346   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.084353   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:08.084361   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:08.084371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:08.099534   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:08.099551   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:08.163985   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:08.163995   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:08.164006   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:08.224823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:08.224840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:08.256602   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:08.256618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:10.818842   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:10.829650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:10.829713   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:10.866261   54581 cri.go:89] found id: ""
	I1201 19:36:10.866275   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.866293   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:10.866299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:10.866378   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:10.902129   54581 cri.go:89] found id: ""
	I1201 19:36:10.902157   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.902166   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:10.902171   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:10.902287   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:10.935780   54581 cri.go:89] found id: ""
	I1201 19:36:10.935796   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.935803   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:10.935809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:10.935868   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:10.961965   54581 cri.go:89] found id: ""
	I1201 19:36:10.961979   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.961987   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:10.961993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:10.962050   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:10.988752   54581 cri.go:89] found id: ""
	I1201 19:36:10.988765   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.988772   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:10.988778   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:10.988855   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:11.013768   54581 cri.go:89] found id: ""
	I1201 19:36:11.013783   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.013790   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:11.013795   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:11.013852   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:11.039944   54581 cri.go:89] found id: ""
	I1201 19:36:11.039959   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.039982   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:11.039992   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:11.040003   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:11.096281   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:11.096300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:11.107964   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:11.107989   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:11.174240   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:11.174253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:11.174265   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:11.240383   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:11.240406   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.770524   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:13.780691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:13.780754   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:13.805306   54581 cri.go:89] found id: ""
	I1201 19:36:13.805321   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.805328   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:13.805333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:13.805390   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:13.830209   54581 cri.go:89] found id: ""
	I1201 19:36:13.830223   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.830229   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:13.830235   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:13.830294   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:13.859814   54581 cri.go:89] found id: ""
	I1201 19:36:13.859827   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.859834   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:13.859839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:13.859905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:13.888545   54581 cri.go:89] found id: ""
	I1201 19:36:13.888559   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.888567   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:13.888573   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:13.888642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:13.918445   54581 cri.go:89] found id: ""
	I1201 19:36:13.918459   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.918466   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:13.918471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:13.918530   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:13.944112   54581 cri.go:89] found id: ""
	I1201 19:36:13.944125   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.944132   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:13.944147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:13.944206   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:13.969842   54581 cri.go:89] found id: ""
	I1201 19:36:13.969856   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.969863   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:13.969872   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:13.969882   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.999132   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:13.999150   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:14.056959   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:14.056979   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:14.068288   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:14.068304   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:14.137988   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:14.128502   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.129198   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.131274   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.132362   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.133913   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:14.128502   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.129198   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.131274   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.132362   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.133913   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:14.137997   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:14.138008   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:16.704768   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:16.715111   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:16.715170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:16.740051   54581 cri.go:89] found id: ""
	I1201 19:36:16.740065   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.740072   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:16.740078   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:16.740150   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:16.765291   54581 cri.go:89] found id: ""
	I1201 19:36:16.765309   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.765317   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:16.765323   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:16.765380   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:16.790212   54581 cri.go:89] found id: ""
	I1201 19:36:16.790226   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.790233   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:16.790238   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:16.790297   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:16.814700   54581 cri.go:89] found id: ""
	I1201 19:36:16.814714   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.814721   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:16.814726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:16.814785   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:16.851986   54581 cri.go:89] found id: ""
	I1201 19:36:16.852000   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.852007   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:16.852012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:16.852067   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:16.883217   54581 cri.go:89] found id: ""
	I1201 19:36:16.883231   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.883237   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:16.883243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:16.883301   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:16.922552   54581 cri.go:89] found id: ""
	I1201 19:36:16.922566   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.922574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:16.922582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:16.922591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:16.982282   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:16.982300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:16.993387   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:16.993401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:17.063398   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:17.055109   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.055736   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.057541   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.058088   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.059799   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:17.055109   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.055736   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.057541   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.058088   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.059799   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:17.063409   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:17.063421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:17.125575   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:17.125594   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.654741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:19.665378   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:19.665445   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:19.690531   54581 cri.go:89] found id: ""
	I1201 19:36:19.690545   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.690553   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:19.690559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:19.690617   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:19.715409   54581 cri.go:89] found id: ""
	I1201 19:36:19.715423   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.715431   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:19.715436   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:19.715494   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:19.743995   54581 cri.go:89] found id: ""
	I1201 19:36:19.744009   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.744016   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:19.744021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:19.744078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:19.769191   54581 cri.go:89] found id: ""
	I1201 19:36:19.769204   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.769212   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:19.769217   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:19.769286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:19.793617   54581 cri.go:89] found id: ""
	I1201 19:36:19.793631   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.793638   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:19.793644   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:19.793704   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:19.818818   54581 cri.go:89] found id: ""
	I1201 19:36:19.818832   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.818840   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:19.818845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:19.818914   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:19.852332   54581 cri.go:89] found id: ""
	I1201 19:36:19.852346   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.852368   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:19.852378   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:19.852389   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.884627   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:19.884642   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:19.947006   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:19.947026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:19.958524   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:19.958539   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:20.040965   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:20.013332   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.014049   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.015824   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.016507   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.018079   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:20.013332   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.014049   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.015824   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.016507   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.018079   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:20.040976   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:20.040988   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:22.622750   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:22.637572   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:22.637637   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:22.667700   54581 cri.go:89] found id: ""
	I1201 19:36:22.667714   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.667721   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:22.667727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:22.667786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:22.700758   54581 cri.go:89] found id: ""
	I1201 19:36:22.700776   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.700802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:22.700815   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:22.700916   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:22.727217   54581 cri.go:89] found id: ""
	I1201 19:36:22.727230   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.727238   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:22.727243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:22.727299   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:22.753365   54581 cri.go:89] found id: ""
	I1201 19:36:22.753379   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.753386   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:22.753392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:22.753459   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:22.779306   54581 cri.go:89] found id: ""
	I1201 19:36:22.779320   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.779327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:22.779336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:22.779394   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:22.804830   54581 cri.go:89] found id: ""
	I1201 19:36:22.804844   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.804860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:22.804866   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:22.804924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:22.831440   54581 cri.go:89] found id: ""
	I1201 19:36:22.831470   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.831478   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:22.831486   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:22.831496   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:22.889394   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:22.889412   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:22.901968   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:22.901983   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:22.974567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:22.965837   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.966826   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968514   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968930   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.970623   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:22.965837   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.966826   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968514   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968930   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.970623   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:22.974577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:22.974588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:23.043112   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:23.043130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.573279   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:25.584019   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:25.584078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:25.613416   54581 cri.go:89] found id: ""
	I1201 19:36:25.613430   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.613446   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:25.613452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:25.613541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:25.638108   54581 cri.go:89] found id: ""
	I1201 19:36:25.638121   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.638132   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:25.638138   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:25.638198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:25.667581   54581 cri.go:89] found id: ""
	I1201 19:36:25.667596   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.667603   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:25.667608   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:25.667676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:25.695307   54581 cri.go:89] found id: ""
	I1201 19:36:25.695320   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.695328   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:25.695333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:25.695396   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:25.719360   54581 cri.go:89] found id: ""
	I1201 19:36:25.719386   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.719394   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:25.719399   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:25.719466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:25.745097   54581 cri.go:89] found id: ""
	I1201 19:36:25.745120   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.745127   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:25.745133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:25.745207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:25.769545   54581 cri.go:89] found id: ""
	I1201 19:36:25.769558   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.769565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:25.769573   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:25.769584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.799870   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:25.799887   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:25.856015   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:25.856035   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:25.868391   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:25.868407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:25.939423   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:25.931657   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.932304   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.933988   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.934305   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.935915   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:25.931657   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.932304   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.933988   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.934305   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.935915   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:25.939433   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:25.939443   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.503343   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:28.515763   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:28.515836   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:28.541630   54581 cri.go:89] found id: ""
	I1201 19:36:28.541644   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.541652   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:28.541657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:28.541728   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:28.568196   54581 cri.go:89] found id: ""
	I1201 19:36:28.568210   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.568217   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:28.568222   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:28.568280   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:28.593437   54581 cri.go:89] found id: ""
	I1201 19:36:28.593450   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.593457   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:28.593463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:28.593557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:28.619497   54581 cri.go:89] found id: ""
	I1201 19:36:28.619511   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.619518   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:28.619523   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:28.619583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:28.647866   54581 cri.go:89] found id: ""
	I1201 19:36:28.647880   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.647887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:28.647893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:28.647950   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:28.673922   54581 cri.go:89] found id: ""
	I1201 19:36:28.673935   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.673943   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:28.673949   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:28.674021   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:28.698912   54581 cri.go:89] found id: ""
	I1201 19:36:28.698926   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.698933   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:28.698941   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:28.698963   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:28.756082   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:28.756100   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:28.767897   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:28.767913   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:28.836301   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:28.836312   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:28.836330   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.907788   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:28.907807   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.438620   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:31.448979   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:31.449042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:31.475188   54581 cri.go:89] found id: ""
	I1201 19:36:31.475202   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.475209   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:31.475215   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:31.475281   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:31.500385   54581 cri.go:89] found id: ""
	I1201 19:36:31.500398   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.500405   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:31.500411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:31.500468   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:31.525394   54581 cri.go:89] found id: ""
	I1201 19:36:31.525407   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.525414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:31.525419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:31.525481   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:31.550792   54581 cri.go:89] found id: ""
	I1201 19:36:31.550808   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.550815   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:31.550821   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:31.550880   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:31.578076   54581 cri.go:89] found id: ""
	I1201 19:36:31.578090   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.578097   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:31.578102   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:31.578159   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:31.604021   54581 cri.go:89] found id: ""
	I1201 19:36:31.604035   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.604042   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:31.604047   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:31.604108   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:31.633105   54581 cri.go:89] found id: ""
	I1201 19:36:31.633119   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.633126   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:31.633134   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:31.633145   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.663524   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:31.663540   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:31.723171   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:31.723189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:31.734100   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:31.734115   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:31.796567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:31.796577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:31.796588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:34.366168   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:34.376457   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:34.376516   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:34.404948   54581 cri.go:89] found id: ""
	I1201 19:36:34.404977   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.404985   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:34.404991   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:34.405063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:34.431692   54581 cri.go:89] found id: ""
	I1201 19:36:34.431706   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.431713   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:34.431718   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:34.431779   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:34.456671   54581 cri.go:89] found id: ""
	I1201 19:36:34.456685   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.456692   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:34.456697   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:34.456755   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:34.481585   54581 cri.go:89] found id: ""
	I1201 19:36:34.481612   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.481620   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:34.481626   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:34.481696   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:34.506818   54581 cri.go:89] found id: ""
	I1201 19:36:34.506832   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.506839   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:34.506845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:34.506906   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:34.535407   54581 cri.go:89] found id: ""
	I1201 19:36:34.535421   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.535428   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:34.535433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:34.535492   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:34.561311   54581 cri.go:89] found id: ""
	I1201 19:36:34.561324   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.561331   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:34.561339   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:34.561350   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:34.592150   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:34.592167   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:34.648352   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:34.648370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:34.659451   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:34.659467   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:34.728942   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:34.728952   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:34.728962   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:37.291213   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:37.301261   54581 kubeadm.go:602] duration metric: took 4m4.008784532s to restartPrimaryControlPlane
	W1201 19:36:37.301323   54581 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1201 19:36:37.301393   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:36:37.706665   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:36:37.720664   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:36:37.728529   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:36:37.728581   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:36:37.736430   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:36:37.736440   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:36:37.736492   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:36:37.744494   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:36:37.744550   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:36:37.752457   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:36:37.760187   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:36:37.760243   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:36:37.768060   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.775900   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:36:37.775969   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.783655   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:36:37.791670   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:36:37.791723   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:36:37.799641   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:36:37.841794   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:36:37.841853   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:36:37.909907   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:36:37.909969   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:36:37.910004   54581 kubeadm.go:319] OS: Linux
	I1201 19:36:37.910048   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:36:37.910095   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:36:37.910141   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:36:37.910188   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:36:37.910235   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:36:37.910281   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:36:37.910325   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:36:37.910372   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:36:37.910417   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:36:37.982652   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:36:37.982760   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:36:37.982849   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:36:37.989962   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:36:37.995459   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:36:37.995557   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:36:37.995632   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:36:37.995718   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:36:37.995796   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:36:37.995875   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:36:37.995938   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:36:37.996008   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:36:37.996076   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:36:37.996160   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:36:37.996243   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:36:37.996290   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:36:37.996352   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:36:38.264574   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:36:38.510797   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:36:39.269570   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:36:39.443703   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:36:40.036623   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:36:40.036725   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:36:40.042253   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:36:40.045573   54581 out.go:252]   - Booting up control plane ...
	I1201 19:36:40.045681   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:36:40.045758   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:36:40.050263   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:36:40.088031   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:36:40.088133   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:36:40.088246   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:36:40.088332   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:36:40.088370   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:36:40.243689   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:36:40.243803   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:40:40.243834   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000165379s
	I1201 19:40:40.243866   54581 kubeadm.go:319] 
	I1201 19:40:40.243923   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:40:40.243956   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:40:40.244085   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:40:40.244090   54581 kubeadm.go:319] 
	I1201 19:40:40.244193   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:40:40.244226   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:40:40.244256   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:40:40.244260   54581 kubeadm.go:319] 
	I1201 19:40:40.248975   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:40:40.249435   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:40:40.249566   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:40:40.249901   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1201 19:40:40.249908   54581 kubeadm.go:319] 
	I1201 19:40:40.249980   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1201 19:40:40.250118   54581 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000165379s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1201 19:40:40.250247   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:40:40.662369   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:40:40.675843   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:40:40.675896   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:40:40.683554   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:40:40.683563   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:40:40.683613   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:40:40.691612   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:40:40.691669   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:40:40.699280   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:40:40.706997   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:40:40.707052   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:40:40.714497   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.722891   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:40:40.722949   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.730907   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:40:40.739761   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:40:40.739818   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:40:40.747474   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:40:40.788983   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:40:40.789292   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:40:40.865634   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:40:40.865697   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:40:40.865734   54581 kubeadm.go:319] OS: Linux
	I1201 19:40:40.865777   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:40:40.865824   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:40:40.865869   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:40:40.865916   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:40:40.865963   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:40:40.866013   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:40:40.866057   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:40:40.866104   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:40:40.866149   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:40:40.935875   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:40:40.935986   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:40:40.936084   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:40:40.941886   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:40:40.947334   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:40:40.947424   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:40:40.947488   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:40:40.947568   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:40:40.947628   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:40:40.947696   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:40:40.947749   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:40:40.947810   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:40:40.947870   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:40:40.947944   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:40:40.948014   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:40:40.948051   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:40:40.948105   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:40:41.580020   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:40:42.099824   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:40:42.537556   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:40:42.996026   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:40:43.565704   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:40:43.566397   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:40:43.569105   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:40:43.572244   54581 out.go:252]   - Booting up control plane ...
	I1201 19:40:43.572342   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:40:43.572765   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:40:43.573983   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:40:43.595015   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:40:43.595116   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:40:43.603073   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:40:43.603347   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:40:43.603559   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:40:43.744445   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:40:43.744558   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:44:43.744318   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000287424s
	I1201 19:44:43.744348   54581 kubeadm.go:319] 
	I1201 19:44:43.744432   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:44:43.744486   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:44:43.744623   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:44:43.744628   54581 kubeadm.go:319] 
	I1201 19:44:43.744749   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:44:43.744781   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:44:43.744822   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:44:43.744831   54581 kubeadm.go:319] 
	I1201 19:44:43.748926   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:44:43.749322   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:44:43.749424   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:44:43.749683   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1201 19:44:43.749689   54581 kubeadm.go:319] 
	I1201 19:44:43.749753   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1201 19:44:43.749803   54581 kubeadm.go:403] duration metric: took 12m10.492478835s to StartCluster
	I1201 19:44:43.749833   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:44:43.749893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:44:43.774966   54581 cri.go:89] found id: ""
	I1201 19:44:43.774979   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.774986   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:44:43.774992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:44:43.775053   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:44:43.800769   54581 cri.go:89] found id: ""
	I1201 19:44:43.800783   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.800790   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:44:43.800796   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:44:43.800854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:44:43.827282   54581 cri.go:89] found id: ""
	I1201 19:44:43.827295   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.827302   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:44:43.827308   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:44:43.827364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:44:43.853930   54581 cri.go:89] found id: ""
	I1201 19:44:43.853944   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.853951   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:44:43.853957   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:44:43.854013   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:44:43.882816   54581 cri.go:89] found id: ""
	I1201 19:44:43.882830   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.882837   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:44:43.882843   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:44:43.882903   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:44:43.909261   54581 cri.go:89] found id: ""
	I1201 19:44:43.909274   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.909281   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:44:43.909287   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:44:43.909344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:44:43.933693   54581 cri.go:89] found id: ""
	I1201 19:44:43.933706   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.933715   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:44:43.933724   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:44:43.933733   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:44:43.990075   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:44:43.990092   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:44:44.001155   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:44:44.001170   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:44:44.070458   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:44:44.070469   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:44:44.070479   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:44:44.136228   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:44:44.136248   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1201 19:44:44.166389   54581 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1201 19:44:44.166422   54581 out.go:285] * 
	W1201 19:44:44.166485   54581 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:44:44.166502   54581 out.go:285] * 
	W1201 19:44:44.168627   54581 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:44:44.175592   54581 out.go:203] 
	W1201 19:44:44.179124   54581 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:44:44.179186   54581 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1201 19:44:44.179207   54581 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1201 19:44:44.182569   54581 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667922542Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667937483Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667949232Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667962787Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668042514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668060319Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668082095Z" level=info msg="runtime interface created"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668088528Z" level=info msg="created NRI interface"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668104946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668151788Z" level=info msg="Connect containerd service"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668662446Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.670384243Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682522727Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682782323Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682944050Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682830321Z" level=info msg="Start recovering state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708040674Z" level=info msg="Start event monitor"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708239258Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708327772Z" level=info msg="Start streaming server"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708412037Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708599093Z" level=info msg="runtime interface starting up..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708668573Z" level=info msg="starting plugins..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708729215Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708948574Z" level=info msg="containerd successfully booted in 0.060821s"
	Dec 01 19:32:31 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:44:45.568723   21656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:45.569351   21656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:45.570891   21656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:45.571316   21656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:45.572784   21656 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:44:45 up  1:27,  0 user,  load average: 0.05, 0.19, 0.39
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:44:42 functional-428744 kubelet[21460]: E1201 19:44:42.638631   21460 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:42 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:42 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:43 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 01 19:44:43 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:43 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:43 functional-428744 kubelet[21466]: E1201 19:44:43.388577   21466 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:43 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:43 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:44 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 01 19:44:44 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:44 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:44 functional-428744 kubelet[21540]: E1201 19:44:44.150010   21540 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:44 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:44 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:44 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 01 19:44:44 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:44 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:44 functional-428744 kubelet[21572]: E1201 19:44:44.912464   21572 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:44 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:44 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:45 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 01 19:44:45 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:45 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:45 functional-428744 kubelet[21661]: E1201 19:44:45.644227   21661 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (385.897833ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig (737.97s)
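The kubelet crash loop captured above has a clear trigger in the logs: the v1.35.0-beta.0 kubelet refuses to start on a cgroup v1 host unless its configuration explicitly opts in (the `FailCgroupV1`/`failCgroupV1` option mentioned in the SystemVerification warning). As a hedged diagnostic sketch, not part of the test run, the following checks which cgroup hierarchy a host or minikube node container is using, assuming `/sys/fs/cgroup` is mounted as on any systemd system:

```shell
#!/bin/sh
# cgroup v2 mounts a single unified "cgroup2fs" filesystem at /sys/fs/cgroup;
# cgroup v1 (legacy or hybrid) mounts a tmpfs holding per-controller subdirs.
fs_type=$(stat -fc %T /sys/fs/cgroup)
if [ "$fs_type" = "cgroup2fs" ]; then
    echo "cgroup v2 (unified)"
else
    echo "cgroup v1 ($fs_type)"
fi
```

On this run's 5.15.0-1084-aws kernel the check would report cgroup v1, consistent with the kubeadm warning; per that warning, the workaround is either moving the host to cgroup v2 or setting the kubelet configuration option `failCgroupV1` to `false` (and skipping the validation), rather than the older `--extra-config=kubelet.cgroup-driver=systemd` suggestion minikube prints.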

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth (2.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-428744 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:825: (dbg) Non-zero exit: kubectl --context functional-428744 get po -l tier=control-plane -n kube-system -o=json: exit status 1 (60.101998ms)

-- stdout --
	{
	    "apiVersion": "v1",
	    "items": [],
	    "kind": "List",
	    "metadata": {
	        "resourceVersion": ""
	    }
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:827: failed to get components. args "kubectl --context functional-428744 get po -l tier=control-plane -n kube-system -o=json": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (308.39913ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 logs -n 25: (1.002885893s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-019259 image ls --format short --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format json --alsologtostderr                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls --format table --alsologtostderr                                                                                             │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ ssh     │ functional-019259 ssh pgrep buildkitd                                                                                                                   │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ image   │ functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr                                                  │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ image   │ functional-019259 image ls                                                                                                                              │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ delete  │ -p functional-019259                                                                                                                                    │ functional-019259 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │ 01 Dec 25 19:17 UTC │
	│ start   │ -p functional-428744 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:17 UTC │                     │
	│ start   │ -p functional-428744 --alsologtostderr -v=8                                                                                                             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:26 UTC │                     │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add registry.k8s.io/pause:latest                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache add minikube-local-cache-test:functional-428744                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ functional-428744 cache delete minikube-local-cache-test:functional-428744                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl images                                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ cache   │ functional-428744 cache reload                                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ kubectl │ functional-428744 kubectl -- --context functional-428744 get pods                                                                                       │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ start   │ -p functional-428744 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:32:28
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:32:28.671063   54581 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:32:28.671177   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671181   54581 out.go:374] Setting ErrFile to fd 2...
	I1201 19:32:28.671185   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671462   54581 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:32:28.671791   54581 out.go:368] Setting JSON to false
	I1201 19:32:28.672593   54581 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4500,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:32:28.672645   54581 start.go:143] virtualization:  
	I1201 19:32:28.676118   54581 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:32:28.679062   54581 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:32:28.679153   54581 notify.go:221] Checking for updates...
	I1201 19:32:28.685968   54581 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:32:28.688852   54581 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:32:28.691733   54581 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:32:28.694613   54581 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:32:28.697549   54581 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:32:28.700837   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:28.700934   54581 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:32:28.730800   54581 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:32:28.730894   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.786972   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.776963779 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.787065   54581 docker.go:319] overlay module found
	I1201 19:32:28.789990   54581 out.go:179] * Using the docker driver based on existing profile
	I1201 19:32:28.792702   54581 start.go:309] selected driver: docker
	I1201 19:32:28.792712   54581 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.792814   54581 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:32:28.792926   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.854079   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.841219008 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.854498   54581 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1201 19:32:28.854520   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:28.854580   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:28.854619   54581 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.858061   54581 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:32:28.860972   54581 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:32:28.863997   54581 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:32:28.866788   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:28.866980   54581 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:32:28.895611   54581 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:32:28.895623   54581 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:32:28.922565   54581 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:32:29.117617   54581 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:32:29.117759   54581 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:32:29.117789   54581 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117872   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:32:29.117882   54581 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 108.863µs
	I1201 19:32:29.117888   54581 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:32:29.117898   54581 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117926   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:32:29.117930   54581 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 33.443µs
	I1201 19:32:29.117935   54581 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117944   54581 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117979   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:32:29.117983   54581 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.647µs
	I1201 19:32:29.117988   54581 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117998   54581 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118023   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:32:29.118035   54581 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 30.974µs
	I1201 19:32:29.118040   54581 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118040   54581 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:32:29.118048   54581 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118072   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:32:29.118066   54581 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118075   54581 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 28.709µs
	I1201 19:32:29.118080   54581 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118088   54581 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118102   54581 start.go:364] duration metric: took 25.197µs to acquireMachinesLock for "functional-428744"
	I1201 19:32:29.118113   54581 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:32:29.118114   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:32:29.118117   54581 fix.go:54] fixHost starting: 
	I1201 19:32:29.118118   54581 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.457µs
	I1201 19:32:29.118122   54581 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:32:29.118129   54581 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118152   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:32:29.118156   54581 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 27.199µs
	I1201 19:32:29.118160   54581 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:32:29.118167   54581 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118216   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:32:29.118220   54581 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 53.562µs
	I1201 19:32:29.118229   54581 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:32:29.118236   54581 cache.go:87] Successfully saved all images to host disk.
	I1201 19:32:29.118392   54581 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:32:29.135509   54581 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:32:29.135543   54581 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:32:29.140504   54581 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:32:29.140530   54581 machine.go:94] provisionDockerMachine start ...
	I1201 19:32:29.140609   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.157677   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.157997   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.158004   54581 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:32:29.305012   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.305026   54581 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:32:29.305098   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.323134   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.323429   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.323437   54581 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:32:29.478458   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.478532   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.497049   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.498161   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.498184   54581 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:32:29.645663   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:32:29.645679   54581 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:32:29.645696   54581 ubuntu.go:190] setting up certificates
	I1201 19:32:29.645703   54581 provision.go:84] configureAuth start
	I1201 19:32:29.645772   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:29.663161   54581 provision.go:143] copyHostCerts
	I1201 19:32:29.663227   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:32:29.663233   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:32:29.663306   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:32:29.663413   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:32:29.663416   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:32:29.663441   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:32:29.663488   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:32:29.663496   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:32:29.663517   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:32:29.663560   54581 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:32:29.922590   54581 provision.go:177] copyRemoteCerts
	I1201 19:32:29.922645   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:32:29.922682   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.944750   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.066257   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:32:30.114189   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:32:30.139869   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:32:30.162018   54581 provision.go:87] duration metric: took 516.289617ms to configureAuth
	I1201 19:32:30.162044   54581 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:32:30.162294   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:30.162301   54581 machine.go:97] duration metric: took 1.021765793s to provisionDockerMachine
	I1201 19:32:30.162308   54581 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:32:30.162319   54581 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:32:30.162368   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:32:30.162422   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.181979   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.285977   54581 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:32:30.289531   54581 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:32:30.289549   54581 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:32:30.289559   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:32:30.289616   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:32:30.289694   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:32:30.289767   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:32:30.289821   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:32:30.297763   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:30.315893   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:32:30.335096   54581 start.go:296] duration metric: took 172.774471ms for postStartSetup
	I1201 19:32:30.335168   54581 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:32:30.335214   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.355398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.458545   54581 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:32:30.463103   54581 fix.go:56] duration metric: took 1.344978374s for fixHost
	I1201 19:32:30.463118   54581 start.go:83] releasing machines lock for "functional-428744", held for 1.345010357s
	I1201 19:32:30.463185   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:30.480039   54581 ssh_runner.go:195] Run: cat /version.json
	I1201 19:32:30.480081   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.480337   54581 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:32:30.480395   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.499221   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.501398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.601341   54581 ssh_runner.go:195] Run: systemctl --version
	I1201 19:32:30.695138   54581 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1201 19:32:30.699523   54581 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:32:30.699612   54581 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:32:30.707379   54581 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:32:30.707392   54581 start.go:496] detecting cgroup driver to use...
	I1201 19:32:30.707423   54581 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:32:30.707469   54581 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:32:30.722782   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:32:30.736023   54581 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:32:30.736084   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:32:30.751857   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:32:30.765106   54581 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:32:30.881005   54581 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:32:31.019194   54581 docker.go:234] disabling docker service ...
	I1201 19:32:31.019259   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:32:31.037044   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:32:31.052926   54581 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:32:31.181456   54581 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:32:31.340481   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:32:31.355001   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:32:31.370840   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:32:31.380231   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:32:31.389693   54581 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:32:31.389764   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:32:31.399360   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.408437   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:32:31.417370   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.426455   54581 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:32:31.434636   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:32:31.443735   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:32:31.453324   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:32:31.462516   54581 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:32:31.470270   54581 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:32:31.478172   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:31.592137   54581 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:32:31.712107   54581 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:32:31.712186   54581 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:32:31.715994   54581 start.go:564] Will wait 60s for crictl version
	I1201 19:32:31.716056   54581 ssh_runner.go:195] Run: which crictl
	I1201 19:32:31.719610   54581 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:32:31.745073   54581 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:32:31.745152   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.765358   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.791628   54581 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:32:31.794721   54581 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:32:31.811133   54581 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:32:31.818179   54581 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1201 19:32:31.821064   54581 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:32:31.821193   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:31.821269   54581 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:32:31.856356   54581 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:32:31.856368   54581 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:32:31.856374   54581 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:32:31.856475   54581 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:32:31.856536   54581 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:32:31.895308   54581 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1201 19:32:31.895325   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:31.895333   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:31.895346   54581 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:32:31.895366   54581 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:32:31.895478   54581 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 19:32:31.895541   54581 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:32:31.905339   54581 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:32:31.905406   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:32:31.913323   54581 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:32:31.927846   54581 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:32:31.940396   54581 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1201 19:32:31.953139   54581 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:32:31.956806   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:32.073166   54581 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:32:32.587407   54581 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:32:32.587419   54581 certs.go:195] generating shared ca certs ...
	I1201 19:32:32.587436   54581 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:32:32.587628   54581 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:32:32.587672   54581 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:32:32.587679   54581 certs.go:257] generating profile certs ...
	I1201 19:32:32.587796   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:32:32.587858   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:32:32.587895   54581 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:32:32.588027   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:32:32.588060   54581 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:32:32.588067   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:32:32.588104   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:32:32.588128   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:32:32.588158   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:32:32.588202   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:32.589935   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:32:32.611510   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:32:32.631449   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:32:32.652864   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:32:32.672439   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:32:32.690857   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:32:32.709160   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:32:32.727076   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:32:32.745055   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:32:32.762625   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:32:32.780355   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:32:32.797626   54581 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:32:32.810250   54581 ssh_runner.go:195] Run: openssl version
	I1201 19:32:32.816425   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:32:32.825294   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829094   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829148   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.869893   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:32:32.877720   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:32:32.886198   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889911   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889967   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.930479   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:32:32.938463   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:32:32.946940   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950621   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950676   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.991499   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
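The lines above compute each CA certificate's OpenSSL subject hash and link it into `/etc/ssl/certs/<hash>.0`, which is the lookup name OpenSSL-based clients use to find a trusted CA. A minimal standalone sketch of that hash-and-link step (the throwaway cert, temp directory, and `demoCA` subject are illustrative, not from the log):

```shell
# Create a throwaway self-signed CA cert, compute its subject hash the way
# the log does (openssl x509 -hash), and create the <hash>.0 symlink that
# OpenSSL's cert directory lookup expects.
set -e
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$tmp/ca.key" -out "$tmp/ca.pem" -days 1 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$tmp/ca.pem")
ln -fs "$tmp/ca.pem" "$tmp/$hash.0"
echo "$hash"
```

The `b5213941.0` name seen in the log is exactly such a subject hash for minikubeCA.pem.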
	I1201 19:32:32.999452   54581 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:32:33.003313   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:32:33.045305   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:32:33.087269   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:32:33.128376   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:32:33.169796   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:32:33.211259   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
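Each of the `-checkend 86400` runs above asks whether a control-plane certificate will still be valid 86400 seconds (24 hours) from now: exit status 0 means yes, nonzero means it is expiring and would need regeneration. A small illustration with a self-generated cert (the file names and 30-day lifetime are made up for the example):

```shell
# -checkend N exits 0 if the certificate remains valid for another N
# seconds, nonzero otherwise. minikube uses N=86400 (one day).
set -e
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=test" \
  -keyout "$tmp/k.pem" -out "$tmp/c.pem" -days 30 2>/dev/null
if openssl x509 -noout -in "$tmp/c.pem" -checkend 86400; then
  echo "certificate valid for at least another day"
else
  echo "certificate expires within 24h; would trigger regeneration"
fi
```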
	I1201 19:32:33.257335   54581 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryM
irror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:33.257412   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:32:33.257501   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.284260   54581 cri.go:89] found id: ""
	I1201 19:32:33.284320   54581 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:32:33.292458   54581 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:32:33.292468   54581 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:32:33.292518   54581 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:32:33.300158   54581 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.300668   54581 kubeconfig.go:125] found "functional-428744" server: "https://192.168.49.2:8441"
	I1201 19:32:33.301960   54581 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:32:33.310120   54581 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-01 19:17:59.066738599 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-01 19:32:31.946987775 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
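The drift detection above hinges on `diff -u` exit codes: 0 when the deployed kubeadm.yaml matches the newly rendered one, 1 when they differ, in which case minikube reconfigures from the `.new` file (the later `sudo cp kubeadm.yaml.new kubeadm.yaml` in this log). A toy sketch of that decision, with made-up file contents:

```shell
# diff -u exits 0 on identical files, 1 on differences; a nonzero exit
# here is treated as "config drift: adopt the new file".
tmp=$(mktemp -d)
printf 'value: "A"\n' > "$tmp/kubeadm.yaml"
printf 'value: "B"\n' > "$tmp/kubeadm.yaml.new"
if ! diff -u "$tmp/kubeadm.yaml" "$tmp/kubeadm.yaml.new" > "$tmp/drift.diff"; then
  echo "config drift detected"
  cp "$tmp/kubeadm.yaml.new" "$tmp/kubeadm.yaml"   # mirrors the log's sudo cp
fi
```

In this run the drift is the apiserver `enable-admission-plugins` value changing to `NamespaceAutoProvision`, which is what the ExtraConfig test exercises.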
	I1201 19:32:33.310138   54581 kubeadm.go:1161] stopping kube-system containers ...
	I1201 19:32:33.310149   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1201 19:32:33.310213   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.338492   54581 cri.go:89] found id: ""
	I1201 19:32:33.338551   54581 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1201 19:32:33.356342   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:32:33.364607   54581 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec  1 19:22 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec  1 19:22 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5676 Dec  1 19:22 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5588 Dec  1 19:22 /etc/kubernetes/scheduler.conf
	
	I1201 19:32:33.364669   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:32:33.372608   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:32:33.380647   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.380700   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:32:33.388464   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.397123   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.397189   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.404816   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:32:33.412562   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.412628   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
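The "may not be in ... - will remove" branches above key on `grep`'s exit status: if the expected API endpoint string is absent from a kubeconfig (grep exits nonzero), the file is deleted so `kubeadm init phase kubeconfig` can regenerate it. A self-contained sketch of that check (the stale `other-host` server line is invented for the example):

```shell
# grep -q exits nonzero when the endpoint is missing, which is the signal
# to remove the file and let kubeadm regenerate it.
tmp=$(mktemp -d)
endpoint="https://control-plane.minikube.internal:8441"
printf 'server: https://other-host:6443\n' > "$tmp/kubelet.conf"
if ! grep -q "$endpoint" "$tmp/kubelet.conf"; then
  echo "expected endpoint missing from kubelet.conf; removing it"
  rm -f "$tmp/kubelet.conf"
fi
```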
	I1201 19:32:33.420390   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:32:33.428330   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:33.477124   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.484075   54581 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.006926734s)
	I1201 19:32:34.484135   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.694382   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.769616   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.812433   54581 api_server.go:52] waiting for apiserver process to appear ...
	I1201 19:32:34.812505   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.313033   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.812993   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.312704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.813245   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.313300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.812687   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.312636   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.813205   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.813572   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.312587   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.812696   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.313535   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.813472   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.813224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.313067   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.813328   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.312678   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.813484   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.312731   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.812683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.313429   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.813026   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.312606   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.812689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.813689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.313474   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.312618   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.813410   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.313371   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.812979   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.312792   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.812691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.313042   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.813445   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.313212   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.812741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.312722   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.812580   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.313621   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.813459   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.313224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.812880   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.313609   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.313283   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.812739   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.313558   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.813248   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.313098   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.813623   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.313600   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.813357   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.312559   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.812827   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.312653   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.812616   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.313447   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.813117   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.312712   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.812713   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.314198   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.313642   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.813457   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.313464   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.812697   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.312626   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.813299   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.813267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.312931   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.812887   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.312894   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.813197   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.312689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.812595   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.313557   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.313428   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.813327   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.313520   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.812744   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.313564   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.812611   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.313634   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.813393   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.313426   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.812688   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.313372   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.812638   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.313360   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.812897   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.313015   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.813101   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.312709   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.812907   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.312644   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.812569   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.313009   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.813448   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.312851   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.813268   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.313602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.312692   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.813538   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.313307   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.813008   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.313397   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.313454   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.813423   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.313344   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.813145   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:34.312690   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
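The long run of lines above is minikube polling `pgrep` roughly every half second for a kube-apiserver process, until an overall deadline lapses (here it never appears, which is why the run fails over to gathering logs). The shape of that wait loop can be sketched as follows; a backgrounded `sleep` stands in for kube-apiserver, and the pattern, interval, and 10-second deadline are illustrative:

```shell
# Poll for a process by name until it appears or a deadline passes,
# mimicking the apiserver wait loop in the log. 'sleep' is the stand-in
# target process here.
sleep 30 &
target=$!
found=0
deadline=$(( $(date +%s) + 10 ))
while [ "$(date +%s)" -lt "$deadline" ]; do
  if pgrep -x sleep > /dev/null; then
    found=1
    break
  fi
  sleep 1
done
kill "$target" 2>/dev/null
echo "found=$found"
```

In the failing run, every poll exits nonzero because no apiserver container was ever started (the earlier `crictl ps` calls all report `found id: ""`), so the loop exhausts its timeout.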
	I1201 19:33:34.813369   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:34.813443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:34.847624   54581 cri.go:89] found id: ""
	I1201 19:33:34.847638   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.847645   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:34.847650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:34.847707   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:34.877781   54581 cri.go:89] found id: ""
	I1201 19:33:34.877795   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.877802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:34.877807   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:34.877865   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:34.906556   54581 cri.go:89] found id: ""
	I1201 19:33:34.906569   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.906575   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:34.906581   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:34.906638   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:34.932243   54581 cri.go:89] found id: ""
	I1201 19:33:34.932257   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.932264   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:34.932275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:34.932334   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:34.958307   54581 cri.go:89] found id: ""
	I1201 19:33:34.958320   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.958327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:34.958333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:34.958393   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:34.987839   54581 cri.go:89] found id: ""
	I1201 19:33:34.987852   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.987860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:34.987865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:34.987924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:35.013339   54581 cri.go:89] found id: ""
	I1201 19:33:35.013353   54581 logs.go:282] 0 containers: []
	W1201 19:33:35.013360   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:35.013367   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:35.013377   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:35.024284   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:35.024300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:35.102562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:35.102584   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:35.102595   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:35.168823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:35.168843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:35.200459   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:35.200475   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:37.759267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:37.769446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:37.769528   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:37.794441   54581 cri.go:89] found id: ""
	I1201 19:33:37.794454   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.794461   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:37.794467   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:37.794522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:37.825029   54581 cri.go:89] found id: ""
	I1201 19:33:37.825042   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.825049   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:37.825059   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:37.825116   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:37.855847   54581 cri.go:89] found id: ""
	I1201 19:33:37.855860   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.855867   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:37.855872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:37.855932   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:37.892812   54581 cri.go:89] found id: ""
	I1201 19:33:37.892826   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.892833   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:37.892839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:37.892902   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:37.923175   54581 cri.go:89] found id: ""
	I1201 19:33:37.923189   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.923195   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:37.923201   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:37.923260   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:37.956838   54581 cri.go:89] found id: ""
	I1201 19:33:37.956852   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.956858   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:37.956864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:37.956921   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:37.983288   54581 cri.go:89] found id: ""
	I1201 19:33:37.983302   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.983309   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:37.983317   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:37.983328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:38.048803   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:38.048828   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:38.048842   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:38.114525   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:38.114549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:38.144040   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:38.144056   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:38.203160   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:38.203178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:40.714632   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:40.724993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:40.725058   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:40.749954   54581 cri.go:89] found id: ""
	I1201 19:33:40.749968   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.749975   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:40.749981   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:40.750040   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:40.775337   54581 cri.go:89] found id: ""
	I1201 19:33:40.775350   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.775357   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:40.775362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:40.775425   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:40.801568   54581 cri.go:89] found id: ""
	I1201 19:33:40.801582   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.801590   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:40.801595   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:40.801663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:40.829766   54581 cri.go:89] found id: ""
	I1201 19:33:40.829779   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.829786   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:40.829791   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:40.829850   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:40.864362   54581 cri.go:89] found id: ""
	I1201 19:33:40.864376   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.864383   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:40.864389   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:40.864447   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:40.893407   54581 cri.go:89] found id: ""
	I1201 19:33:40.893419   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.893427   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:40.893433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:40.893507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:40.919149   54581 cri.go:89] found id: ""
	I1201 19:33:40.919163   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.919172   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:40.919179   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:40.919189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:40.949474   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:40.949572   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:41.005421   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:41.005440   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:41.016259   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:41.016274   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:41.078378   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:41.078391   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:41.078401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:43.641960   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:43.652106   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:43.652178   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:43.682005   54581 cri.go:89] found id: ""
	I1201 19:33:43.682018   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.682025   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:43.682030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:43.682087   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:43.707580   54581 cri.go:89] found id: ""
	I1201 19:33:43.707593   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.707600   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:43.707606   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:43.707711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:43.732400   54581 cri.go:89] found id: ""
	I1201 19:33:43.732414   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.732421   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:43.732426   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:43.732483   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:43.758218   54581 cri.go:89] found id: ""
	I1201 19:33:43.758232   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.758239   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:43.758245   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:43.758303   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:43.783139   54581 cri.go:89] found id: ""
	I1201 19:33:43.783152   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.783159   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:43.783164   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:43.783227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:43.813453   54581 cri.go:89] found id: ""
	I1201 19:33:43.813467   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.813474   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:43.813480   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:43.813548   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:43.845612   54581 cri.go:89] found id: ""
	I1201 19:33:43.845625   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.845632   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:43.845639   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:43.845649   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:43.909426   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:43.909445   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:43.920543   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:43.920560   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:43.988764   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:43.988776   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:43.988797   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:44.051182   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:44.051208   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:46.583925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:46.594468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:46.594554   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:46.620265   54581 cri.go:89] found id: ""
	I1201 19:33:46.620279   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.620286   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:46.620292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:46.620351   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:46.644633   54581 cri.go:89] found id: ""
	I1201 19:33:46.644652   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.644659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:46.644665   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:46.644721   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:46.669867   54581 cri.go:89] found id: ""
	I1201 19:33:46.669881   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.669888   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:46.669893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:46.669948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:46.694417   54581 cri.go:89] found id: ""
	I1201 19:33:46.694431   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.694438   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:46.694454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:46.694512   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:46.721029   54581 cri.go:89] found id: ""
	I1201 19:33:46.721043   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.721051   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:46.721056   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:46.721114   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:46.747445   54581 cri.go:89] found id: ""
	I1201 19:33:46.747459   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.747466   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:46.747471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:46.747525   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:46.771251   54581 cri.go:89] found id: ""
	I1201 19:33:46.771266   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.771272   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:46.771281   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:46.771290   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:46.829699   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:46.829716   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:46.842077   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:46.842096   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:46.924213   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:46.924225   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:46.924235   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:46.990853   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:46.990872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:49.521683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:49.531974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:49.532042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:49.557473   54581 cri.go:89] found id: ""
	I1201 19:33:49.557514   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.557521   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:49.557527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:49.557640   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:49.583182   54581 cri.go:89] found id: ""
	I1201 19:33:49.583229   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.583237   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:49.583242   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:49.583308   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:49.611533   54581 cri.go:89] found id: ""
	I1201 19:33:49.611546   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.611553   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:49.611559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:49.611615   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:49.637433   54581 cri.go:89] found id: ""
	I1201 19:33:49.637446   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.637460   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:49.637466   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:49.637558   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:49.667274   54581 cri.go:89] found id: ""
	I1201 19:33:49.667287   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.667294   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:49.667299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:49.667358   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:49.696772   54581 cri.go:89] found id: ""
	I1201 19:33:49.696790   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.696797   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:49.696803   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:49.696861   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:49.720607   54581 cri.go:89] found id: ""
	I1201 19:33:49.720621   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.720637   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:49.720645   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:49.720655   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:49.776412   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:49.776431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:49.787417   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:49.787432   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:49.862636   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:49.862647   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:49.862658   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:49.934395   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:49.934421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:52.463339   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:52.473586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:52.473650   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:52.503529   54581 cri.go:89] found id: ""
	I1201 19:33:52.503542   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.503549   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:52.503555   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:52.503618   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:52.531144   54581 cri.go:89] found id: ""
	I1201 19:33:52.531158   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.531165   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:52.531170   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:52.531228   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:52.556664   54581 cri.go:89] found id: ""
	I1201 19:33:52.556678   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.556685   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:52.556691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:52.556753   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:52.583782   54581 cri.go:89] found id: ""
	I1201 19:33:52.583796   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.583802   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:52.583808   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:52.583866   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:52.608468   54581 cri.go:89] found id: ""
	I1201 19:33:52.608481   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.608488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:52.608494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:52.608553   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:52.632068   54581 cri.go:89] found id: ""
	I1201 19:33:52.632081   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.632088   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:52.632093   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:52.632153   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:52.656905   54581 cri.go:89] found id: ""
	I1201 19:33:52.656919   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.656926   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:52.656934   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:52.656944   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:52.715322   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:52.715340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:52.725941   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:52.725956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:52.787814   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:52.787824   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:52.787835   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:52.857124   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:52.857146   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:55.384601   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:55.394657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:55.394724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:55.419003   54581 cri.go:89] found id: ""
	I1201 19:33:55.419016   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.419023   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:55.419028   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:55.419093   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:55.444043   54581 cri.go:89] found id: ""
	I1201 19:33:55.444057   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.444064   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:55.444069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:55.444126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:55.469199   54581 cri.go:89] found id: ""
	I1201 19:33:55.469212   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.469219   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:55.469224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:55.469284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:55.494106   54581 cri.go:89] found id: ""
	I1201 19:33:55.494123   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.494130   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:55.494135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:55.494192   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:55.523658   54581 cri.go:89] found id: ""
	I1201 19:33:55.523671   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.523678   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:55.523683   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:55.523742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:55.549084   54581 cri.go:89] found id: ""
	I1201 19:33:55.549097   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.549105   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:55.549110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:55.549171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:55.573973   54581 cri.go:89] found id: ""
	I1201 19:33:55.573986   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.573993   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:55.574001   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:55.574014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:55.629601   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:55.629618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:55.640511   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:55.640527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:55.703852   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:55.703862   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:55.703875   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:55.767135   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:55.767154   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.297608   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:58.307660   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:58.307729   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:58.331936   54581 cri.go:89] found id: ""
	I1201 19:33:58.331948   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.331955   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:58.331961   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:58.332023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:58.356515   54581 cri.go:89] found id: ""
	I1201 19:33:58.356528   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.356535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:58.356544   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:58.356601   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:58.381178   54581 cri.go:89] found id: ""
	I1201 19:33:58.381191   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.381198   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:58.381203   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:58.381259   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:58.405890   54581 cri.go:89] found id: ""
	I1201 19:33:58.405904   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.405911   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:58.405916   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:58.405971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:58.429783   54581 cri.go:89] found id: ""
	I1201 19:33:58.429796   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.429804   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:58.429809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:58.429875   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:58.454357   54581 cri.go:89] found id: ""
	I1201 19:33:58.454370   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.454377   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:58.454383   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:58.454443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:58.483382   54581 cri.go:89] found id: ""
	I1201 19:33:58.483395   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.483403   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:58.483410   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:58.483421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:58.494465   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:58.494480   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:58.557097   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:58.557108   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:58.557119   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:58.624200   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:58.624219   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.654678   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:58.654694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.213704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:01.225298   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:01.225360   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:01.251173   54581 cri.go:89] found id: ""
	I1201 19:34:01.251187   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.251194   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:01.251200   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:01.251272   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:01.278884   54581 cri.go:89] found id: ""
	I1201 19:34:01.278897   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.278904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:01.278910   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:01.278967   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:01.305393   54581 cri.go:89] found id: ""
	I1201 19:34:01.305407   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.305414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:01.305419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:01.305522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:01.331958   54581 cri.go:89] found id: ""
	I1201 19:34:01.331971   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.331978   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:01.331983   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:01.332042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:01.357701   54581 cri.go:89] found id: ""
	I1201 19:34:01.357714   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.357721   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:01.357727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:01.357786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:01.384631   54581 cri.go:89] found id: ""
	I1201 19:34:01.384645   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.384662   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:01.384668   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:01.384742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:01.410554   54581 cri.go:89] found id: ""
	I1201 19:34:01.410567   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.410574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:01.410582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:01.410591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.466596   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:01.466614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:01.477827   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:01.477843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:01.543509   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:01.543518   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:01.543529   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:01.606587   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:01.606608   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:04.136300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:04.146336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:04.146412   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:04.177880   54581 cri.go:89] found id: ""
	I1201 19:34:04.177894   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.177901   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:04.177906   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:04.177971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:04.203986   54581 cri.go:89] found id: ""
	I1201 19:34:04.203999   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.204006   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:04.204012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:04.204068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:04.228899   54581 cri.go:89] found id: ""
	I1201 19:34:04.228912   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.228920   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:04.228925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:04.228989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:04.254700   54581 cri.go:89] found id: ""
	I1201 19:34:04.254715   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.254722   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:04.254729   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:04.254788   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:04.280370   54581 cri.go:89] found id: ""
	I1201 19:34:04.280383   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.280390   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:04.280396   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:04.280453   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:04.304821   54581 cri.go:89] found id: ""
	I1201 19:34:04.304834   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.304842   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:04.304847   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:04.304910   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:04.331513   54581 cri.go:89] found id: ""
	I1201 19:34:04.331525   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.331533   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:04.331540   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:04.331550   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:04.390353   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:04.390371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:04.403182   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:04.403198   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:04.471239   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:04.471261   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:04.471273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:04.534546   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:04.534567   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:07.063925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:07.074362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:07.074427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:07.107919   54581 cri.go:89] found id: ""
	I1201 19:34:07.107933   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.107940   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:07.107946   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:07.108003   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:07.137952   54581 cri.go:89] found id: ""
	I1201 19:34:07.137965   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.137973   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:07.137978   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:07.138038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:07.172024   54581 cri.go:89] found id: ""
	I1201 19:34:07.172037   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.172044   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:07.172049   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:07.172107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:07.196732   54581 cri.go:89] found id: ""
	I1201 19:34:07.196745   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.196752   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:07.196759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:07.196814   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:07.221862   54581 cri.go:89] found id: ""
	I1201 19:34:07.221875   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.221882   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:07.221888   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:07.221947   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:07.249751   54581 cri.go:89] found id: ""
	I1201 19:34:07.249765   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.249771   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:07.249777   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:07.249833   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:07.275027   54581 cri.go:89] found id: ""
	I1201 19:34:07.275040   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.275047   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:07.275055   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:07.275065   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:07.330139   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:07.330156   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:07.341431   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:07.341447   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:07.404752   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:07.404762   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:07.404780   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:07.471227   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:07.471244   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.003255   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:10.013892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:10.013949   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:10.059011   54581 cri.go:89] found id: ""
	I1201 19:34:10.059025   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.059033   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:10.059039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:10.059101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:10.096138   54581 cri.go:89] found id: ""
	I1201 19:34:10.096152   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.096170   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:10.096177   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:10.096282   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:10.138539   54581 cri.go:89] found id: ""
	I1201 19:34:10.138600   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.138612   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:10.138618   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:10.138688   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:10.168476   54581 cri.go:89] found id: ""
	I1201 19:34:10.168490   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.168497   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:10.168502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:10.168580   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:10.194454   54581 cri.go:89] found id: ""
	I1201 19:34:10.194480   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.194487   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:10.194493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:10.194560   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:10.219419   54581 cri.go:89] found id: ""
	I1201 19:34:10.219432   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.219439   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:10.219445   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:10.219507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:10.244925   54581 cri.go:89] found id: ""
	I1201 19:34:10.244938   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.244945   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:10.244953   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:10.244964   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:10.311653   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:10.311663   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:10.311673   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:10.377857   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:10.377877   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.407833   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:10.407851   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:10.467737   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:10.467757   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:12.980376   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:12.990779   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:12.990838   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:13.016106   54581 cri.go:89] found id: ""
	I1201 19:34:13.016120   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.016127   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:13.016133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:13.016198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:13.044361   54581 cri.go:89] found id: ""
	I1201 19:34:13.044375   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.044382   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:13.044387   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:13.044444   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:13.069827   54581 cri.go:89] found id: ""
	I1201 19:34:13.069841   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.069849   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:13.069854   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:13.069913   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:13.110851   54581 cri.go:89] found id: ""
	I1201 19:34:13.110864   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.110871   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:13.110876   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:13.110933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:13.141612   54581 cri.go:89] found id: ""
	I1201 19:34:13.141626   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.141633   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:13.141638   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:13.141695   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:13.168579   54581 cri.go:89] found id: ""
	I1201 19:34:13.168592   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.168599   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:13.168604   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:13.168676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:13.194182   54581 cri.go:89] found id: ""
	I1201 19:34:13.194196   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.194204   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:13.194211   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:13.194221   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:13.255821   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:13.255840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:13.267071   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:13.267087   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:13.336403   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:13.336424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:13.336434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:13.399839   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:13.399859   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:15.930208   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:15.940605   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:15.940671   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:15.966201   54581 cri.go:89] found id: ""
	I1201 19:34:15.966215   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.966223   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:15.966228   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:15.966291   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:15.996515   54581 cri.go:89] found id: ""
	I1201 19:34:15.996528   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.996535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:15.996541   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:15.996598   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:16.022535   54581 cri.go:89] found id: ""
	I1201 19:34:16.022550   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.022564   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:16.022569   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:16.022630   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:16.057222   54581 cri.go:89] found id: ""
	I1201 19:34:16.057236   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.057246   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:16.057252   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:16.057313   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:16.087879   54581 cri.go:89] found id: ""
	I1201 19:34:16.087893   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.087900   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:16.087905   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:16.087965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:16.120946   54581 cri.go:89] found id: ""
	I1201 19:34:16.120960   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.120968   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:16.120974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:16.121035   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:16.154523   54581 cri.go:89] found id: ""
	I1201 19:34:16.154538   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.154544   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:16.154552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:16.154562   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:16.227282   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:16.227292   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:16.227303   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:16.291304   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:16.291323   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:16.320283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:16.320299   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:16.379997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:16.380014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:18.891691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:18.901502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:18.901561   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:18.926115   54581 cri.go:89] found id: ""
	I1201 19:34:18.926128   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.926135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:18.926141   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:18.926212   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:18.951977   54581 cri.go:89] found id: ""
	I1201 19:34:18.951991   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.951998   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:18.952003   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:18.952068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:18.983248   54581 cri.go:89] found id: ""
	I1201 19:34:18.983266   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.983273   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:18.983278   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:18.983342   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:19.010990   54581 cri.go:89] found id: ""
	I1201 19:34:19.011010   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.011018   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:19.011024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:19.011086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:19.036672   54581 cri.go:89] found id: ""
	I1201 19:34:19.036686   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.036693   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:19.036699   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:19.036767   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:19.061847   54581 cri.go:89] found id: ""
	I1201 19:34:19.061861   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.061868   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:19.061873   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:19.061933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:19.095496   54581 cri.go:89] found id: ""
	I1201 19:34:19.095518   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.095525   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:19.095534   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:19.095544   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:19.160188   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:19.160209   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:19.171389   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:19.171411   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:19.237242   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:19.237253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:19.237273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:19.299987   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:19.300005   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:21.834525   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:21.845009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:21.845070   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:21.869831   54581 cri.go:89] found id: ""
	I1201 19:34:21.869848   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.869855   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:21.869863   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:21.869920   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:21.894806   54581 cri.go:89] found id: ""
	I1201 19:34:21.894819   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.894826   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:21.894831   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:21.894888   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:21.919467   54581 cri.go:89] found id: ""
	I1201 19:34:21.919481   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.919489   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:21.919494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:21.919557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:21.947371   54581 cri.go:89] found id: ""
	I1201 19:34:21.947384   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.947392   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:21.947397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:21.947466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:21.972455   54581 cri.go:89] found id: ""
	I1201 19:34:21.972469   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.972488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:21.972493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:21.972551   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:21.998955   54581 cri.go:89] found id: ""
	I1201 19:34:21.998969   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.998977   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:21.998982   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:21.999044   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:22.030320   54581 cri.go:89] found id: ""
	I1201 19:34:22.030348   54581 logs.go:282] 0 containers: []
	W1201 19:34:22.030356   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:22.030365   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:22.030378   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:22.091531   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:22.091549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:22.107258   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:22.107285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:22.185420   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:22.185431   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:22.185442   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:22.250849   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:22.250866   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:24.779249   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:24.792463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:24.792522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:24.817350   54581 cri.go:89] found id: ""
	I1201 19:34:24.817364   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.817371   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:24.817377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:24.817434   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:24.842191   54581 cri.go:89] found id: ""
	I1201 19:34:24.842205   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.842218   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:24.842224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:24.842284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:24.867478   54581 cri.go:89] found id: ""
	I1201 19:34:24.867492   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.867499   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:24.867505   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:24.867576   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:24.899422   54581 cri.go:89] found id: ""
	I1201 19:34:24.899436   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.899443   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:24.899452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:24.899509   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:24.934866   54581 cri.go:89] found id: ""
	I1201 19:34:24.934880   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.934887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:24.934893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:24.934956   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:24.959270   54581 cri.go:89] found id: ""
	I1201 19:34:24.959284   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.959291   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:24.959297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:24.959362   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:24.984211   54581 cri.go:89] found id: ""
	I1201 19:34:24.984224   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.984231   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:24.984239   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:24.984259   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:25.012471   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:25.012487   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:25.072643   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:25.072660   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:25.083552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:25.083571   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:25.160495   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:25.160504   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:25.160516   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:27.727176   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:27.737246   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:27.737307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:27.761343   54581 cri.go:89] found id: ""
	I1201 19:34:27.761357   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.761364   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:27.761370   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:27.761428   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:27.786257   54581 cri.go:89] found id: ""
	I1201 19:34:27.786276   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.786283   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:27.786288   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:27.786344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:27.810779   54581 cri.go:89] found id: ""
	I1201 19:34:27.810798   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.810807   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:27.810812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:27.810874   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:27.834773   54581 cri.go:89] found id: ""
	I1201 19:34:27.834792   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.834799   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:27.834804   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:27.834860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:27.862223   54581 cri.go:89] found id: ""
	I1201 19:34:27.862241   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.862248   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:27.862253   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:27.862307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:27.887279   54581 cri.go:89] found id: ""
	I1201 19:34:27.887292   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.887299   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:27.887305   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:27.887361   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:27.910821   54581 cri.go:89] found id: ""
	I1201 19:34:27.910834   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.910842   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:27.910849   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:27.910872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:27.920894   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:27.920909   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:27.982787   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:27.982797   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:27.982808   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:28.049448   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:28.049466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:28.083298   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:28.083315   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:30.648755   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:30.659054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:30.659115   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:30.683776   54581 cri.go:89] found id: ""
	I1201 19:34:30.683790   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.683797   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:30.683802   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:30.683858   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:30.708715   54581 cri.go:89] found id: ""
	I1201 19:34:30.708729   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.708736   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:30.708741   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:30.708801   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:30.732741   54581 cri.go:89] found id: ""
	I1201 19:34:30.732754   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.732761   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:30.732767   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:30.732821   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:30.762264   54581 cri.go:89] found id: ""
	I1201 19:34:30.762278   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.762284   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:30.762290   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:30.762353   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:30.789298   54581 cri.go:89] found id: ""
	I1201 19:34:30.789312   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.789319   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:30.789324   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:30.789381   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:30.814068   54581 cri.go:89] found id: ""
	I1201 19:34:30.814081   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.814089   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:30.814095   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:30.814157   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:30.841381   54581 cri.go:89] found id: ""
	I1201 19:34:30.841394   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.841402   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:30.841409   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:30.841431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:30.902920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:30.902931   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:30.902943   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:30.965009   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:30.965026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:30.993347   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:30.993370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:31.049258   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:31.049275   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.560996   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:33.571497   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:33.571557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:33.596875   54581 cri.go:89] found id: ""
	I1201 19:34:33.596889   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.596896   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:33.596901   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:33.596960   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:33.623640   54581 cri.go:89] found id: ""
	I1201 19:34:33.623653   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.623659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:33.623664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:33.623725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:33.647792   54581 cri.go:89] found id: ""
	I1201 19:34:33.647806   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.647814   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:33.647819   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:33.647882   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:33.672114   54581 cri.go:89] found id: ""
	I1201 19:34:33.672127   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.672134   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:33.672139   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:33.672197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:33.704799   54581 cri.go:89] found id: ""
	I1201 19:34:33.704812   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.704820   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:33.704825   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:33.704885   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:33.728981   54581 cri.go:89] found id: ""
	I1201 19:34:33.728995   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.729001   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:33.729006   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:33.729063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:33.756005   54581 cri.go:89] found id: ""
	I1201 19:34:33.756019   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.756027   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:33.756035   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:33.756046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:33.788420   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:33.788437   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:33.848036   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:33.848054   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.858909   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:33.858925   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:33.921156   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:33.921167   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:33.921178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.484434   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:36.494616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:36.494679   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:36.520017   54581 cri.go:89] found id: ""
	I1201 19:34:36.520031   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.520038   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:36.520044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:36.520100   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:36.545876   54581 cri.go:89] found id: ""
	I1201 19:34:36.545890   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.545897   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:36.545903   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:36.545966   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:36.571571   54581 cri.go:89] found id: ""
	I1201 19:34:36.571584   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.571591   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:36.571596   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:36.571653   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:36.596997   54581 cri.go:89] found id: ""
	I1201 19:34:36.597012   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.597019   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:36.597024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:36.597101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:36.623469   54581 cri.go:89] found id: ""
	I1201 19:34:36.623483   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.623491   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:36.623496   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:36.623556   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:36.651811   54581 cri.go:89] found id: ""
	I1201 19:34:36.651824   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.651831   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:36.651837   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:36.651893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:36.676659   54581 cri.go:89] found id: ""
	I1201 19:34:36.676673   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.676680   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:36.676688   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:36.676697   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:36.732392   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:36.732410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:36.743384   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:36.743400   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:36.805329   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:36.805338   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:36.805349   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.867566   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:36.867584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:39.402157   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:39.412161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:39.412220   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:39.439366   54581 cri.go:89] found id: ""
	I1201 19:34:39.439380   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.439387   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:39.439392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:39.439451   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:39.464076   54581 cri.go:89] found id: ""
	I1201 19:34:39.464090   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.464097   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:39.464108   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:39.464171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:39.488248   54581 cri.go:89] found id: ""
	I1201 19:34:39.488262   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.488270   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:39.488275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:39.488331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:39.517302   54581 cri.go:89] found id: ""
	I1201 19:34:39.517315   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.517322   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:39.517328   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:39.517385   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:39.542966   54581 cri.go:89] found id: ""
	I1201 19:34:39.542980   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.542986   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:39.542992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:39.543051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:39.568903   54581 cri.go:89] found id: ""
	I1201 19:34:39.568917   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.568924   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:39.568929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:39.568990   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:39.594057   54581 cri.go:89] found id: ""
	I1201 19:34:39.594069   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.594076   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:39.594084   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:39.594093   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:39.649679   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:39.649698   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:39.660114   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:39.660133   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:39.725472   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:39.725500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:39.725512   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:39.793738   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:39.793756   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:42.322742   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:42.333451   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:42.333536   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:42.368118   54581 cri.go:89] found id: ""
	I1201 19:34:42.368132   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.368139   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:42.368146   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:42.368217   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:42.402172   54581 cri.go:89] found id: ""
	I1201 19:34:42.402186   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.402193   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:42.402198   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:42.402266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:42.426759   54581 cri.go:89] found id: ""
	I1201 19:34:42.426772   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.426780   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:42.426785   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:42.426842   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:42.466084   54581 cri.go:89] found id: ""
	I1201 19:34:42.466097   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.466105   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:42.466110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:42.466168   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:42.490814   54581 cri.go:89] found id: ""
	I1201 19:34:42.490828   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.490835   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:42.490841   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:42.490899   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:42.516557   54581 cri.go:89] found id: ""
	I1201 19:34:42.516570   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.516578   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:42.516583   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:42.516651   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:42.542203   54581 cri.go:89] found id: ""
	I1201 19:34:42.542218   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.542224   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:42.542233   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:42.542243   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:42.599254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:42.599272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:42.610313   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:42.610328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:42.677502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:42.677514   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:42.677527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:42.751656   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:42.751683   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.281764   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:45.295929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:45.296004   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:45.355982   54581 cri.go:89] found id: ""
	I1201 19:34:45.356019   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.356027   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:45.356043   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:45.356214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:45.395974   54581 cri.go:89] found id: ""
	I1201 19:34:45.395987   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.396003   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:45.396008   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:45.396064   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:45.425011   54581 cri.go:89] found id: ""
	I1201 19:34:45.425027   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.425035   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:45.425041   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:45.425175   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:45.450304   54581 cri.go:89] found id: ""
	I1201 19:34:45.450317   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.450325   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:45.450330   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:45.450399   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:45.480282   54581 cri.go:89] found id: ""
	I1201 19:34:45.480296   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.480302   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:45.480307   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:45.480376   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:45.511012   54581 cri.go:89] found id: ""
	I1201 19:34:45.511026   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.511033   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:45.511039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:45.511101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:45.536767   54581 cri.go:89] found id: ""
	I1201 19:34:45.536781   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.536797   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:45.536806   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:45.536818   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:45.547801   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:45.547822   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:45.615408   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:45.615424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:45.615434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:45.679022   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:45.679041   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.711030   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:45.711049   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.268349   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:48.279339   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:48.279398   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:48.304817   54581 cri.go:89] found id: ""
	I1201 19:34:48.304831   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.304839   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:48.304844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:48.304905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:48.329897   54581 cri.go:89] found id: ""
	I1201 19:34:48.329911   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.329919   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:48.329924   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:48.329982   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:48.369087   54581 cri.go:89] found id: ""
	I1201 19:34:48.369100   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.369107   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:48.369112   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:48.369169   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:48.400882   54581 cri.go:89] found id: ""
	I1201 19:34:48.400896   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.400903   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:48.400909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:48.400965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:48.426896   54581 cri.go:89] found id: ""
	I1201 19:34:48.426912   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.426920   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:48.426925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:48.426987   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:48.455956   54581 cri.go:89] found id: ""
	I1201 19:34:48.455969   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.455987   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:48.455994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:48.456051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:48.480640   54581 cri.go:89] found id: ""
	I1201 19:34:48.480653   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.480671   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:48.480679   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:48.480690   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.536591   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:48.536609   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:48.547466   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:48.547482   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:48.620325   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:48.620335   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:48.620345   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:48.683388   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:48.683407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:51.214144   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:51.224292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:51.224364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:51.247923   54581 cri.go:89] found id: ""
	I1201 19:34:51.247937   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.247945   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:51.247952   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:51.248011   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:51.273984   54581 cri.go:89] found id: ""
	I1201 19:34:51.273998   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.274005   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:51.274011   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:51.274072   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:51.298775   54581 cri.go:89] found id: ""
	I1201 19:34:51.298789   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.298796   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:51.298801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:51.298860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:51.326553   54581 cri.go:89] found id: ""
	I1201 19:34:51.326567   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.326574   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:51.326580   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:51.326639   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:51.360945   54581 cri.go:89] found id: ""
	I1201 19:34:51.360959   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.360987   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:51.360992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:51.361059   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:51.396255   54581 cri.go:89] found id: ""
	I1201 19:34:51.396282   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.396290   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:51.396296   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:51.396369   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:51.427687   54581 cri.go:89] found id: ""
	I1201 19:34:51.427700   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.427707   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:51.427715   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:51.427734   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:51.483915   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:51.483934   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:51.495247   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:51.495271   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:51.559547   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:51.559558   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:51.559568   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:51.623141   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:51.623161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:54.157001   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:54.170439   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:54.170498   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:54.203772   54581 cri.go:89] found id: ""
	I1201 19:34:54.203785   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.203792   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:54.203798   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:54.203854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:54.231733   54581 cri.go:89] found id: ""
	I1201 19:34:54.231747   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.231754   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:54.231759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:54.231817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:54.256716   54581 cri.go:89] found id: ""
	I1201 19:34:54.256739   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.256746   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:54.256752   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:54.256817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:54.281376   54581 cri.go:89] found id: ""
	I1201 19:34:54.281390   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.281407   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:54.281413   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:54.281469   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:54.305969   54581 cri.go:89] found id: ""
	I1201 19:34:54.305982   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.305989   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:54.305994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:54.306049   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:54.330385   54581 cri.go:89] found id: ""
	I1201 19:34:54.330399   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.330406   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:54.330422   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:54.330478   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:54.358455   54581 cri.go:89] found id: ""
	I1201 19:34:54.358478   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.358489   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:54.358497   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:54.358508   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:54.422783   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:54.422804   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:54.434139   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:54.434153   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:54.499665   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:54.499677   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:54.499689   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:54.562594   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:54.562614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:57.093944   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:57.104140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:57.104207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:57.129578   54581 cri.go:89] found id: ""
	I1201 19:34:57.129590   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.129597   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:57.129603   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:57.129663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:57.153119   54581 cri.go:89] found id: ""
	I1201 19:34:57.153133   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.153140   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:57.153145   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:57.153202   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:57.178134   54581 cri.go:89] found id: ""
	I1201 19:34:57.178148   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.178155   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:57.178161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:57.178222   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:57.208559   54581 cri.go:89] found id: ""
	I1201 19:34:57.208572   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.208579   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:57.208585   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:57.208642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:57.232807   54581 cri.go:89] found id: ""
	I1201 19:34:57.232821   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.232838   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:57.232844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:57.232898   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:57.257939   54581 cri.go:89] found id: ""
	I1201 19:34:57.257952   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.257959   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:57.257964   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:57.258022   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:57.283855   54581 cri.go:89] found id: ""
	I1201 19:34:57.283869   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.283875   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:57.283883   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:57.283893   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:57.340764   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:57.340781   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:57.352935   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:57.352949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:57.427562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:57.427571   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:57.427581   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:57.490526   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:57.490553   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.020694   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:00.036199   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:00.036266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:00.146207   54581 cri.go:89] found id: ""
	I1201 19:35:00.146226   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.146234   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:00.146241   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:00.146319   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:00.271439   54581 cri.go:89] found id: ""
	I1201 19:35:00.271454   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.271462   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:00.271468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:00.271541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:00.365096   54581 cri.go:89] found id: ""
	I1201 19:35:00.365111   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.365119   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:00.365124   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:00.365190   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:00.419095   54581 cri.go:89] found id: ""
	I1201 19:35:00.419109   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.419116   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:00.419123   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:00.419184   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:00.457455   54581 cri.go:89] found id: ""
	I1201 19:35:00.457470   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.457478   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:00.457507   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:00.457577   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:00.503679   54581 cri.go:89] found id: ""
	I1201 19:35:00.503694   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.503701   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:00.503710   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:00.503803   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:00.546120   54581 cri.go:89] found id: ""
	I1201 19:35:00.546135   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.546142   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:00.546151   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:00.546164   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:00.559836   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:00.559853   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:00.634650   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:00.634660   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:00.634675   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:00.700259   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:00.700278   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.733345   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:00.733363   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.295407   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:03.305664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:03.305725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:03.330370   54581 cri.go:89] found id: ""
	I1201 19:35:03.330385   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.330392   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:03.330397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:03.330452   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:03.356109   54581 cri.go:89] found id: ""
	I1201 19:35:03.356123   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.356130   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:03.356135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:03.356198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:03.382338   54581 cri.go:89] found id: ""
	I1201 19:35:03.382352   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.382360   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:03.382366   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:03.382423   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:03.414550   54581 cri.go:89] found id: ""
	I1201 19:35:03.414564   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.414571   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:03.414577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:03.414633   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:03.438540   54581 cri.go:89] found id: ""
	I1201 19:35:03.438553   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.438560   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:03.438565   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:03.438623   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:03.463113   54581 cri.go:89] found id: ""
	I1201 19:35:03.463127   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.463134   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:03.463140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:03.463204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:03.487632   54581 cri.go:89] found id: ""
	I1201 19:35:03.487645   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.487653   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:03.487660   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:03.487670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.544515   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:03.544536   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:03.555787   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:03.555803   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:03.627256   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:03.627266   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:03.627276   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:03.691235   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:03.691254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.220125   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:06.230749   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:06.230813   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:06.255951   54581 cri.go:89] found id: ""
	I1201 19:35:06.255965   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.255972   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:06.255977   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:06.256034   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:06.281528   54581 cri.go:89] found id: ""
	I1201 19:35:06.281542   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.281549   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:06.281554   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:06.281613   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:06.306502   54581 cri.go:89] found id: ""
	I1201 19:35:06.306515   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.306522   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:06.306527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:06.306590   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:06.337726   54581 cri.go:89] found id: ""
	I1201 19:35:06.337739   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.337745   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:06.337751   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:06.337810   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:06.367682   54581 cri.go:89] found id: ""
	I1201 19:35:06.367696   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.367713   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:06.367726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:06.367793   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:06.397675   54581 cri.go:89] found id: ""
	I1201 19:35:06.397690   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.397707   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:06.397713   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:06.397778   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:06.424426   54581 cri.go:89] found id: ""
	I1201 19:35:06.424439   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.424452   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:06.424460   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:06.424471   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:06.435325   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:06.435340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:06.499920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:06.492188   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.492789   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494445   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494930   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.496500   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:06.492188   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.492789   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494445   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494930   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.496500   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:06.499942   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:06.499952   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:06.564348   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:06.564367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.592906   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:06.592921   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:09.151061   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:09.161179   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:09.161240   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:09.186739   54581 cri.go:89] found id: ""
	I1201 19:35:09.186752   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.186759   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:09.186765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:09.186822   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:09.211245   54581 cri.go:89] found id: ""
	I1201 19:35:09.211259   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.211267   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:09.211273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:09.211336   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:09.239043   54581 cri.go:89] found id: ""
	I1201 19:35:09.239056   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.239063   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:09.239068   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:09.239125   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:09.264055   54581 cri.go:89] found id: ""
	I1201 19:35:09.264068   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.264076   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:09.264081   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:09.264137   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:09.288509   54581 cri.go:89] found id: ""
	I1201 19:35:09.288522   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.288529   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:09.288536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:09.288593   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:09.312763   54581 cri.go:89] found id: ""
	I1201 19:35:09.312777   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.312784   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:09.312789   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:09.312851   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:09.344164   54581 cri.go:89] found id: ""
	I1201 19:35:09.344177   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.344184   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:09.344192   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:09.344203   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:09.356120   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:09.356134   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:09.428320   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:09.420284   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.420923   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.422622   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.423259   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.424866   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:09.420284   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.420923   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.422622   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.423259   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.424866   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:09.428329   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:09.428339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:09.491282   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:09.491301   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:09.518473   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:09.518488   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.081815   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:12.092336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:12.092400   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:12.117269   54581 cri.go:89] found id: ""
	I1201 19:35:12.117284   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.117291   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:12.117297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:12.117355   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:12.141885   54581 cri.go:89] found id: ""
	I1201 19:35:12.141898   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.141904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:12.141909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:12.141968   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:12.166386   54581 cri.go:89] found id: ""
	I1201 19:35:12.166400   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.166407   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:12.166411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:12.166479   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:12.190615   54581 cri.go:89] found id: ""
	I1201 19:35:12.190628   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.190636   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:12.190641   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:12.190701   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:12.219887   54581 cri.go:89] found id: ""
	I1201 19:35:12.219900   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.219907   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:12.219912   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:12.219970   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:12.244718   54581 cri.go:89] found id: ""
	I1201 19:35:12.244731   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.244738   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:12.244743   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:12.244802   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:12.272273   54581 cri.go:89] found id: ""
	I1201 19:35:12.272287   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.272294   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:12.272301   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:12.272312   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.329315   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:12.329334   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:12.343015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:12.343032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:12.419939   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:12.411319   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.412222   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414132   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414468   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.415971   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:12.411319   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.412222   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414132   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414468   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.415971   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:12.419949   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:12.419960   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:12.482187   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:12.482205   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:15.011802   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:15.022432   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:15.022499   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:15.094886   54581 cri.go:89] found id: ""
	I1201 19:35:15.094901   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.094909   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:15.094915   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:15.094978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:15.120839   54581 cri.go:89] found id: ""
	I1201 19:35:15.120853   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.120860   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:15.120865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:15.120927   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:15.150767   54581 cri.go:89] found id: ""
	I1201 19:35:15.150781   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.150795   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:15.150801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:15.150867   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:15.177630   54581 cri.go:89] found id: ""
	I1201 19:35:15.177644   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.177651   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:15.177656   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:15.177727   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:15.203467   54581 cri.go:89] found id: ""
	I1201 19:35:15.203480   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.203498   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:15.203504   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:15.203563   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:15.229010   54581 cri.go:89] found id: ""
	I1201 19:35:15.229023   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.229031   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:15.229036   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:15.229128   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:15.254029   54581 cri.go:89] found id: ""
	I1201 19:35:15.254043   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.254051   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:15.254058   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:15.254068   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:15.309931   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:15.309949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:15.320452   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:15.320466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:15.413158   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:15.405233   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.405928   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.407533   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.408047   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.409794   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:15.405233   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.405928   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.407533   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.408047   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.409794   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:15.413169   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:15.413180   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:15.475409   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:15.475428   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:18.004450   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:18.015126   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:18.015185   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:18.046344   54581 cri.go:89] found id: ""
	I1201 19:35:18.046359   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.046366   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:18.046373   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:18.046436   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:18.074519   54581 cri.go:89] found id: ""
	I1201 19:35:18.074532   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.074539   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:18.074545   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:18.074603   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:18.103787   54581 cri.go:89] found id: ""
	I1201 19:35:18.103801   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.103808   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:18.103814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:18.103869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:18.130363   54581 cri.go:89] found id: ""
	I1201 19:35:18.130377   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.130384   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:18.130390   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:18.130449   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:18.155589   54581 cri.go:89] found id: ""
	I1201 19:35:18.155616   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.155625   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:18.155630   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:18.155699   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:18.180628   54581 cri.go:89] found id: ""
	I1201 19:35:18.180641   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.180648   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:18.180654   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:18.180711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:18.205996   54581 cri.go:89] found id: ""
	I1201 19:35:18.206026   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.206033   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:18.206041   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:18.206051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:18.260718   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:18.260736   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:18.271842   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:18.271858   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:18.342769   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:18.332100   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.332989   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.334523   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.336007   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.337221   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:18.332100   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.332989   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.334523   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.336007   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.337221   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:18.342780   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:18.342793   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:18.423726   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:18.423744   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:20.954199   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:20.964087   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:20.964143   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:20.987490   54581 cri.go:89] found id: ""
	I1201 19:35:20.987504   54581 logs.go:282] 0 containers: []
	W1201 19:35:20.987510   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:20.987516   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:20.987572   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:21.012114   54581 cri.go:89] found id: ""
	I1201 19:35:21.012128   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.012135   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:21.012140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:21.012201   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:21.037730   54581 cri.go:89] found id: ""
	I1201 19:35:21.037744   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.037751   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:21.037756   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:21.037815   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:21.062445   54581 cri.go:89] found id: ""
	I1201 19:35:21.062458   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.062465   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:21.062471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:21.062529   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:21.086847   54581 cri.go:89] found id: ""
	I1201 19:35:21.086860   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.086867   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:21.086872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:21.086930   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:21.111866   54581 cri.go:89] found id: ""
	I1201 19:35:21.111880   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.111886   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:21.111892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:21.111948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:21.136296   54581 cri.go:89] found id: ""
	I1201 19:35:21.136311   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.136318   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:21.136326   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:21.136343   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:21.200999   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:21.201009   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:21.201020   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:21.265838   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:21.265857   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:21.296214   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:21.296230   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:21.354254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:21.354272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:23.868647   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:23.879143   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:23.879205   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:23.907613   54581 cri.go:89] found id: ""
	I1201 19:35:23.907633   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.907640   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:23.907645   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:23.907705   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:23.932767   54581 cri.go:89] found id: ""
	I1201 19:35:23.932781   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.932787   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:23.932793   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:23.932849   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:23.961305   54581 cri.go:89] found id: ""
	I1201 19:35:23.961319   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.961326   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:23.961331   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:23.961387   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:23.986651   54581 cri.go:89] found id: ""
	I1201 19:35:23.986664   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.986670   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:23.986676   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:23.986734   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:24.011204   54581 cri.go:89] found id: ""
	I1201 19:35:24.011218   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.011225   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:24.011230   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:24.011286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:24.040784   54581 cri.go:89] found id: ""
	I1201 19:35:24.040798   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.040806   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:24.040812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:24.040871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:24.067432   54581 cri.go:89] found id: ""
	I1201 19:35:24.067446   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.067453   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:24.067461   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:24.067472   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:24.132929   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:24.132946   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:24.132956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:24.194894   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:24.194912   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:24.225351   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:24.225366   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:24.282142   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:24.282161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:26.793143   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:26.803454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:26.803518   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:26.827433   54581 cri.go:89] found id: ""
	I1201 19:35:26.827447   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.827454   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:26.827459   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:26.827514   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:26.851666   54581 cri.go:89] found id: ""
	I1201 19:35:26.851680   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.851686   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:26.851691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:26.851749   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:26.880353   54581 cri.go:89] found id: ""
	I1201 19:35:26.880367   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.880374   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:26.880379   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:26.880437   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:26.908944   54581 cri.go:89] found id: ""
	I1201 19:35:26.908957   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.908964   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:26.908969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:26.909025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:26.933983   54581 cri.go:89] found id: ""
	I1201 19:35:26.933996   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.934003   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:26.934009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:26.934069   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:26.958791   54581 cri.go:89] found id: ""
	I1201 19:35:26.958805   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.958812   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:26.958818   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:26.958878   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:26.983156   54581 cri.go:89] found id: ""
	I1201 19:35:26.983170   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.983177   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:26.983185   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:26.983200   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:27.038997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:27.039015   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:27.050299   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:27.050314   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:27.113733   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:27.113744   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:27.113754   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:27.176267   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:27.176285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.706128   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:29.716285   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:29.716344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:29.741420   54581 cri.go:89] found id: ""
	I1201 19:35:29.741435   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.741442   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:29.741447   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:29.741545   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:29.766524   54581 cri.go:89] found id: ""
	I1201 19:35:29.766538   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.766545   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:29.766550   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:29.766616   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:29.795421   54581 cri.go:89] found id: ""
	I1201 19:35:29.795434   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.795441   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:29.795446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:29.795511   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:29.821121   54581 cri.go:89] found id: ""
	I1201 19:35:29.821135   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.821142   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:29.821147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:29.821204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:29.849641   54581 cri.go:89] found id: ""
	I1201 19:35:29.849654   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.849662   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:29.849667   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:29.849724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:29.874049   54581 cri.go:89] found id: ""
	I1201 19:35:29.874063   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.874069   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:29.874075   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:29.874136   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:29.897867   54581 cri.go:89] found id: ""
	I1201 19:35:29.897880   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.897887   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:29.897895   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:29.897905   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:29.959029   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:29.959046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.991283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:29.991298   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:30.051265   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:30.051286   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:30.082322   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:30.082339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:30.173300   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:32.673672   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:32.683965   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:32.684023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:32.712191   54581 cri.go:89] found id: ""
	I1201 19:35:32.712204   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.712211   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:32.712216   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:32.712275   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:32.739246   54581 cri.go:89] found id: ""
	I1201 19:35:32.739259   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.739266   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:32.739272   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:32.739331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:32.763898   54581 cri.go:89] found id: ""
	I1201 19:35:32.763911   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.763924   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:32.763929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:32.763989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:32.789967   54581 cri.go:89] found id: ""
	I1201 19:35:32.789990   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.789997   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:32.790004   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:32.790063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:32.816013   54581 cri.go:89] found id: ""
	I1201 19:35:32.816028   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.816035   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:32.816040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:32.816098   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:32.839560   54581 cri.go:89] found id: ""
	I1201 19:35:32.839573   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.839580   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:32.839586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:32.839644   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:32.868062   54581 cri.go:89] found id: ""
	I1201 19:35:32.868075   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.868082   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:32.868090   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:32.868099   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:32.923266   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:32.923285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:32.934015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:32.934030   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:33.005502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:33.005512   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:33.005523   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:33.075965   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:33.075984   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.605628   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:35.617054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:35.617126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:35.646998   54581 cri.go:89] found id: ""
	I1201 19:35:35.647012   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.647019   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:35.647025   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:35.647086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:35.676130   54581 cri.go:89] found id: ""
	I1201 19:35:35.676143   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.676150   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:35.676155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:35.676211   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:35.700589   54581 cri.go:89] found id: ""
	I1201 19:35:35.700602   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.700609   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:35.700616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:35.700672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:35.725233   54581 cri.go:89] found id: ""
	I1201 19:35:35.725246   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.725253   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:35.725273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:35.725343   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:35.750382   54581 cri.go:89] found id: ""
	I1201 19:35:35.750396   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.750403   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:35.750408   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:35.750462   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:35.775219   54581 cri.go:89] found id: ""
	I1201 19:35:35.775235   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.775243   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:35.775248   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:35.775320   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:35.800831   54581 cri.go:89] found id: ""
	I1201 19:35:35.800845   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.800852   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:35.800859   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:35.800870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:35.866740   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:35.858616   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.859347   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861068   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861726   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.863343   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:35.858616   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.859347   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861068   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861726   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.863343   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:35.866756   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:35.866767   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:35.931013   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:35.931031   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.958721   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:35.958743   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:36.015847   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:36.015863   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.535518   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:38.545931   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:38.545993   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:38.571083   54581 cri.go:89] found id: ""
	I1201 19:35:38.571097   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.571104   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:38.571109   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:38.571170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:38.608738   54581 cri.go:89] found id: ""
	I1201 19:35:38.608752   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.608759   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:38.608765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:38.608820   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:38.635605   54581 cri.go:89] found id: ""
	I1201 19:35:38.635619   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.635626   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:38.635631   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:38.635689   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:38.668134   54581 cri.go:89] found id: ""
	I1201 19:35:38.668147   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.668155   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:38.668172   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:38.668231   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:38.693505   54581 cri.go:89] found id: ""
	I1201 19:35:38.693519   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.693526   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:38.693531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:38.693602   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:38.719017   54581 cri.go:89] found id: ""
	I1201 19:35:38.719031   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.719039   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:38.719044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:38.719103   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:38.748727   54581 cri.go:89] found id: ""
	I1201 19:35:38.748740   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.748747   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:38.748754   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:38.748765   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:38.778021   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:38.778037   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:38.838504   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:38.838524   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.851587   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:38.851603   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:38.919080   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:38.909975   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.911254   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.912341   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.913320   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.914352   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:38.919115   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:38.919130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.484602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:41.495239   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:41.495298   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:41.525151   54581 cri.go:89] found id: ""
	I1201 19:35:41.525165   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.525172   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:41.525191   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:41.525256   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:41.551287   54581 cri.go:89] found id: ""
	I1201 19:35:41.551301   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.551309   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:41.551329   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:41.551392   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:41.577108   54581 cri.go:89] found id: ""
	I1201 19:35:41.577124   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.577131   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:41.577136   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:41.577204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:41.613970   54581 cri.go:89] found id: ""
	I1201 19:35:41.613983   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.613991   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:41.614005   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:41.614063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:41.647948   54581 cri.go:89] found id: ""
	I1201 19:35:41.647961   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.647968   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:41.647973   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:41.648038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:41.675741   54581 cri.go:89] found id: ""
	I1201 19:35:41.675754   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.675761   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:41.675770   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:41.675827   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:41.701031   54581 cri.go:89] found id: ""
	I1201 19:35:41.701053   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.701061   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:41.701068   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:41.701079   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:41.712066   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:41.712081   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:41.774820   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:41.767074   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.767651   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769208   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769794   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.771321   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:41.774852   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:41.774864   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.837237   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:41.837254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:41.867407   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:41.867423   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.425417   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:44.436694   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:44.436764   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:44.462550   54581 cri.go:89] found id: ""
	I1201 19:35:44.462565   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.462571   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:44.462577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:44.462634   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:44.490237   54581 cri.go:89] found id: ""
	I1201 19:35:44.490250   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.490257   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:44.490262   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:44.490318   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:44.517417   54581 cri.go:89] found id: ""
	I1201 19:35:44.517431   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.517438   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:44.517443   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:44.517523   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:44.542502   54581 cri.go:89] found id: ""
	I1201 19:35:44.542516   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.542523   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:44.542528   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:44.542588   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:44.568636   54581 cri.go:89] found id: ""
	I1201 19:35:44.568650   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.568682   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:44.568688   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:44.568756   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:44.602872   54581 cri.go:89] found id: ""
	I1201 19:35:44.602891   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.602898   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:44.602904   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:44.602961   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:44.633265   54581 cri.go:89] found id: ""
	I1201 19:35:44.633280   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.633287   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:44.633295   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:44.633305   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:44.704029   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:44.695965   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.696791   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698434   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698915   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.700082   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:44.704040   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:44.704051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:44.768055   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:44.768075   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:44.797083   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:44.797098   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.852537   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:44.852555   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.364630   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:47.374921   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:47.374978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:47.399587   54581 cri.go:89] found id: ""
	I1201 19:35:47.399600   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.399607   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:47.399613   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:47.399672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:47.426120   54581 cri.go:89] found id: ""
	I1201 19:35:47.426134   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.426141   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:47.426147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:47.426227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:47.457662   54581 cri.go:89] found id: ""
	I1201 19:35:47.457676   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.457683   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:47.457689   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:47.457747   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:47.482682   54581 cri.go:89] found id: ""
	I1201 19:35:47.482702   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.482709   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:47.482728   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:47.482796   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:47.511319   54581 cri.go:89] found id: ""
	I1201 19:35:47.511334   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.511341   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:47.511346   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:47.511409   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:47.543730   54581 cri.go:89] found id: ""
	I1201 19:35:47.543742   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.543760   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:47.543765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:47.543831   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:47.572333   54581 cri.go:89] found id: ""
	I1201 19:35:47.572347   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.572355   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:47.572363   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:47.572385   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:47.637165   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:47.637184   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.648940   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:47.648956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:47.711651   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:47.704333   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.704738   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706241   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706574   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.708054   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:47.711662   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:47.711681   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:47.773144   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:47.773163   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:50.303086   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:50.313234   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:50.313293   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:50.337483   54581 cri.go:89] found id: ""
	I1201 19:35:50.337515   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.337522   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:50.337527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:50.337583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:50.363911   54581 cri.go:89] found id: ""
	I1201 19:35:50.363927   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.363934   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:50.363939   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:50.363994   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:50.388359   54581 cri.go:89] found id: ""
	I1201 19:35:50.388373   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.388380   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:50.388386   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:50.388441   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:50.412983   54581 cri.go:89] found id: ""
	I1201 19:35:50.412996   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.413003   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:50.413014   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:50.413073   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:50.440996   54581 cri.go:89] found id: ""
	I1201 19:35:50.441017   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.441024   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:50.441030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:50.441085   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:50.467480   54581 cri.go:89] found id: ""
	I1201 19:35:50.467493   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.467501   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:50.467506   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:50.467567   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:50.494388   54581 cri.go:89] found id: ""
	I1201 19:35:50.494402   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.494409   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:50.494416   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:50.494427   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:50.550339   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:50.550359   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:50.561242   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:50.561258   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:50.633849   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:50.625518   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.626220   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.627078   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628200   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628973   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:50.633860   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:50.633870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:50.702260   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:50.702280   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:53.234959   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:53.245018   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:53.245083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:53.276399   54581 cri.go:89] found id: ""
	I1201 19:35:53.276413   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.276420   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:53.276425   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:53.276491   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:53.305853   54581 cri.go:89] found id: ""
	I1201 19:35:53.305866   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.305873   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:53.305878   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:53.305935   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:53.335241   54581 cri.go:89] found id: ""
	I1201 19:35:53.335255   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.335263   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:53.335269   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:53.335328   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:53.359467   54581 cri.go:89] found id: ""
	I1201 19:35:53.359481   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.359488   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:53.359493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:53.359550   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:53.384120   54581 cri.go:89] found id: ""
	I1201 19:35:53.384134   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.384141   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:53.384147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:53.384203   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:53.414128   54581 cri.go:89] found id: ""
	I1201 19:35:53.414141   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.414149   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:53.414155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:53.414214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:53.439408   54581 cri.go:89] found id: ""
	I1201 19:35:53.439421   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.439428   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:53.439436   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:53.439446   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:53.495007   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:53.495026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:53.505932   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:53.505948   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:53.572678   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:53.572688   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:53.572702   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:53.650600   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:53.650621   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:56.183319   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:56.193782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:56.193843   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:56.224114   54581 cri.go:89] found id: ""
	I1201 19:35:56.224128   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.224135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:56.224140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:56.224197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:56.254013   54581 cri.go:89] found id: ""
	I1201 19:35:56.254027   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.254034   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:56.254040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:56.254102   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:56.279886   54581 cri.go:89] found id: ""
	I1201 19:35:56.279900   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.279908   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:56.279914   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:56.279976   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:56.304943   54581 cri.go:89] found id: ""
	I1201 19:35:56.304956   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.304963   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:56.304969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:56.305025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:56.328633   54581 cri.go:89] found id: ""
	I1201 19:35:56.328647   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.328654   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:56.328659   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:56.328715   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:56.357255   54581 cri.go:89] found id: ""
	I1201 19:35:56.357269   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.357276   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:56.357281   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:56.357340   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:56.381420   54581 cri.go:89] found id: ""
	I1201 19:35:56.381434   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.381441   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:56.381449   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:56.381459   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:56.439709   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:56.439728   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:56.450590   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:56.450605   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:56.516412   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:56.516423   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:56.516435   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:56.577800   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:56.577828   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.114477   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:59.124117   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:59.124179   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:59.151351   54581 cri.go:89] found id: ""
	I1201 19:35:59.151364   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.151372   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:59.151377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:59.151433   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:59.179997   54581 cri.go:89] found id: ""
	I1201 19:35:59.180010   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.180017   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:59.180022   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:59.180084   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:59.204818   54581 cri.go:89] found id: ""
	I1201 19:35:59.204832   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.204859   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:59.204864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:59.204923   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:59.230443   54581 cri.go:89] found id: ""
	I1201 19:35:59.230456   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.230464   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:59.230470   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:59.230524   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:59.254548   54581 cri.go:89] found id: ""
	I1201 19:35:59.254561   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.254569   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:59.254574   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:59.254629   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:59.282564   54581 cri.go:89] found id: ""
	I1201 19:35:59.282577   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.282584   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:59.282590   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:59.282645   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:59.310544   54581 cri.go:89] found id: ""
	I1201 19:35:59.310557   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.310565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:59.310573   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:59.310587   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:59.377012   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:59.377021   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:59.377032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:59.441479   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:59.441511   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.471908   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:59.471924   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:59.527613   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:59.527631   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.040294   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:02.051787   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:02.051869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:02.077788   54581 cri.go:89] found id: ""
	I1201 19:36:02.077801   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.077808   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:02.077814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:02.077871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:02.103346   54581 cri.go:89] found id: ""
	I1201 19:36:02.103359   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.103366   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:02.103371   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:02.103427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:02.128949   54581 cri.go:89] found id: ""
	I1201 19:36:02.128963   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.128970   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:02.128975   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:02.129033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:02.153585   54581 cri.go:89] found id: ""
	I1201 19:36:02.153598   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.153605   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:02.153611   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:02.153668   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:02.180499   54581 cri.go:89] found id: ""
	I1201 19:36:02.180513   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.180520   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:02.180531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:02.180592   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:02.206116   54581 cri.go:89] found id: ""
	I1201 19:36:02.206131   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.206138   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:02.206144   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:02.206210   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:02.232470   54581 cri.go:89] found id: ""
	I1201 19:36:02.232484   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.232492   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:02.232500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:02.232513   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:02.295347   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:02.295367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:02.323002   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:02.323018   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:02.382028   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:02.382046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.393159   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:02.393176   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:02.457522   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:04.957729   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:04.967951   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:04.968012   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:04.993754   54581 cri.go:89] found id: ""
	I1201 19:36:04.993769   54581 logs.go:282] 0 containers: []
	W1201 19:36:04.993776   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:04.993782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:04.993844   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:05.019859   54581 cri.go:89] found id: ""
	I1201 19:36:05.019873   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.019881   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:05.019886   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:05.019943   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:05.047016   54581 cri.go:89] found id: ""
	I1201 19:36:05.047031   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.047038   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:05.047046   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:05.047107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:05.072292   54581 cri.go:89] found id: ""
	I1201 19:36:05.072306   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.072313   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:05.072318   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:05.072377   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:05.099842   54581 cri.go:89] found id: ""
	I1201 19:36:05.099857   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.099864   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:05.099870   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:05.099926   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:05.125552   54581 cri.go:89] found id: ""
	I1201 19:36:05.125566   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.125573   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:05.125579   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:05.125635   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:05.150637   54581 cri.go:89] found id: ""
	I1201 19:36:05.150651   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.150659   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:05.150667   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:05.150677   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:05.218391   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:05.218410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:05.246651   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:05.246670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:05.303677   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:05.303694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:05.314794   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:05.314809   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:05.380077   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:07.881622   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:07.893048   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:07.893109   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:07.918109   54581 cri.go:89] found id: ""
	I1201 19:36:07.918122   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.918129   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:07.918134   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:07.918196   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:07.943504   54581 cri.go:89] found id: ""
	I1201 19:36:07.943518   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.943525   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:07.943536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:07.943595   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:07.969943   54581 cri.go:89] found id: ""
	I1201 19:36:07.969958   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.969965   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:07.969971   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:07.970033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:07.994994   54581 cri.go:89] found id: ""
	I1201 19:36:07.995009   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.995015   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:07.995021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:07.995083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:08.020591   54581 cri.go:89] found id: ""
	I1201 19:36:08.020605   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.020612   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:08.020617   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:08.020676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:08.053041   54581 cri.go:89] found id: ""
	I1201 19:36:08.053056   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.053063   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:08.053069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:08.053129   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:08.084333   54581 cri.go:89] found id: ""
	I1201 19:36:08.084346   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.084353   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:08.084361   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:08.084371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:08.099534   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:08.099551   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:08.163985   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:08.163995   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:08.164006   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:08.224823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:08.224840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:08.256602   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:08.256618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:10.818842   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:10.829650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:10.829713   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:10.866261   54581 cri.go:89] found id: ""
	I1201 19:36:10.866275   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.866293   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:10.866299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:10.866378   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:10.902129   54581 cri.go:89] found id: ""
	I1201 19:36:10.902157   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.902166   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:10.902171   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:10.902287   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:10.935780   54581 cri.go:89] found id: ""
	I1201 19:36:10.935796   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.935803   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:10.935809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:10.935868   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:10.961965   54581 cri.go:89] found id: ""
	I1201 19:36:10.961979   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.961987   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:10.961993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:10.962050   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:10.988752   54581 cri.go:89] found id: ""
	I1201 19:36:10.988765   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.988772   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:10.988778   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:10.988855   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:11.013768   54581 cri.go:89] found id: ""
	I1201 19:36:11.013783   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.013790   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:11.013795   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:11.013852   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:11.039944   54581 cri.go:89] found id: ""
	I1201 19:36:11.039959   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.039982   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:11.039992   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:11.040003   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:11.096281   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:11.096300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:11.107964   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:11.107989   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:11.174240   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:11.174253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:11.174265   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:11.240383   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:11.240406   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.770524   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:13.780691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:13.780754   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:13.805306   54581 cri.go:89] found id: ""
	I1201 19:36:13.805321   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.805328   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:13.805333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:13.805390   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:13.830209   54581 cri.go:89] found id: ""
	I1201 19:36:13.830223   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.830229   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:13.830235   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:13.830294   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:13.859814   54581 cri.go:89] found id: ""
	I1201 19:36:13.859827   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.859834   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:13.859839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:13.859905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:13.888545   54581 cri.go:89] found id: ""
	I1201 19:36:13.888559   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.888567   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:13.888573   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:13.888642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:13.918445   54581 cri.go:89] found id: ""
	I1201 19:36:13.918459   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.918466   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:13.918471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:13.918530   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:13.944112   54581 cri.go:89] found id: ""
	I1201 19:36:13.944125   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.944132   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:13.944147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:13.944206   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:13.969842   54581 cri.go:89] found id: ""
	I1201 19:36:13.969856   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.969863   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:13.969872   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:13.969882   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.999132   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:13.999150   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:14.056959   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:14.056979   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:14.068288   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:14.068304   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:14.137988   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:14.128502   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.129198   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.131274   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.132362   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.133913   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:14.128502   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.129198   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.131274   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.132362   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.133913   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:14.137997   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:14.138008   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:16.704768   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:16.715111   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:16.715170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:16.740051   54581 cri.go:89] found id: ""
	I1201 19:36:16.740065   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.740072   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:16.740078   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:16.740150   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:16.765291   54581 cri.go:89] found id: ""
	I1201 19:36:16.765309   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.765317   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:16.765323   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:16.765380   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:16.790212   54581 cri.go:89] found id: ""
	I1201 19:36:16.790226   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.790233   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:16.790238   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:16.790297   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:16.814700   54581 cri.go:89] found id: ""
	I1201 19:36:16.814714   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.814721   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:16.814726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:16.814785   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:16.851986   54581 cri.go:89] found id: ""
	I1201 19:36:16.852000   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.852007   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:16.852012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:16.852067   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:16.883217   54581 cri.go:89] found id: ""
	I1201 19:36:16.883231   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.883237   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:16.883243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:16.883301   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:16.922552   54581 cri.go:89] found id: ""
	I1201 19:36:16.922566   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.922574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:16.922582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:16.922591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:16.982282   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:16.982300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:16.993387   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:16.993401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:17.063398   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:17.055109   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.055736   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.057541   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.058088   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.059799   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:17.055109   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.055736   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.057541   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.058088   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.059799   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:17.063409   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:17.063421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:17.125575   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:17.125594   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.654741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:19.665378   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:19.665445   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:19.690531   54581 cri.go:89] found id: ""
	I1201 19:36:19.690545   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.690553   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:19.690559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:19.690617   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:19.715409   54581 cri.go:89] found id: ""
	I1201 19:36:19.715423   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.715431   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:19.715436   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:19.715494   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:19.743995   54581 cri.go:89] found id: ""
	I1201 19:36:19.744009   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.744016   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:19.744021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:19.744078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:19.769191   54581 cri.go:89] found id: ""
	I1201 19:36:19.769204   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.769212   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:19.769217   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:19.769286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:19.793617   54581 cri.go:89] found id: ""
	I1201 19:36:19.793631   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.793638   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:19.793644   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:19.793704   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:19.818818   54581 cri.go:89] found id: ""
	I1201 19:36:19.818832   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.818840   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:19.818845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:19.818914   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:19.852332   54581 cri.go:89] found id: ""
	I1201 19:36:19.852346   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.852368   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:19.852378   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:19.852389   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.884627   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:19.884642   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:19.947006   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:19.947026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:19.958524   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:19.958539   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:20.040965   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:20.013332   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.014049   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.015824   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.016507   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.018079   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:20.013332   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.014049   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.015824   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.016507   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.018079   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:20.040976   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:20.040988   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:22.622750   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:22.637572   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:22.637637   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:22.667700   54581 cri.go:89] found id: ""
	I1201 19:36:22.667714   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.667721   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:22.667727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:22.667786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:22.700758   54581 cri.go:89] found id: ""
	I1201 19:36:22.700776   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.700802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:22.700815   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:22.700916   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:22.727217   54581 cri.go:89] found id: ""
	I1201 19:36:22.727230   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.727238   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:22.727243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:22.727299   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:22.753365   54581 cri.go:89] found id: ""
	I1201 19:36:22.753379   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.753386   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:22.753392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:22.753459   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:22.779306   54581 cri.go:89] found id: ""
	I1201 19:36:22.779320   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.779327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:22.779336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:22.779394   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:22.804830   54581 cri.go:89] found id: ""
	I1201 19:36:22.804844   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.804860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:22.804866   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:22.804924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:22.831440   54581 cri.go:89] found id: ""
	I1201 19:36:22.831470   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.831478   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:22.831486   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:22.831496   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:22.889394   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:22.889412   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:22.901968   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:22.901983   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:22.974567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:22.965837   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.966826   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968514   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968930   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.970623   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:22.965837   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.966826   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968514   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968930   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.970623   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:22.974577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:22.974588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:23.043112   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:23.043130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.573279   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:25.584019   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:25.584078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:25.613416   54581 cri.go:89] found id: ""
	I1201 19:36:25.613430   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.613446   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:25.613452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:25.613541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:25.638108   54581 cri.go:89] found id: ""
	I1201 19:36:25.638121   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.638132   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:25.638138   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:25.638198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:25.667581   54581 cri.go:89] found id: ""
	I1201 19:36:25.667596   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.667603   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:25.667608   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:25.667676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:25.695307   54581 cri.go:89] found id: ""
	I1201 19:36:25.695320   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.695328   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:25.695333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:25.695396   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:25.719360   54581 cri.go:89] found id: ""
	I1201 19:36:25.719386   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.719394   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:25.719399   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:25.719466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:25.745097   54581 cri.go:89] found id: ""
	I1201 19:36:25.745120   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.745127   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:25.745133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:25.745207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:25.769545   54581 cri.go:89] found id: ""
	I1201 19:36:25.769558   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.769565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:25.769573   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:25.769584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.799870   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:25.799887   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:25.856015   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:25.856035   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:25.868391   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:25.868407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:25.939423   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:25.931657   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.932304   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.933988   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.934305   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.935915   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:25.931657   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.932304   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.933988   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.934305   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.935915   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:25.939433   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:25.939443   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.503343   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:28.515763   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:28.515836   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:28.541630   54581 cri.go:89] found id: ""
	I1201 19:36:28.541644   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.541652   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:28.541657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:28.541728   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:28.568196   54581 cri.go:89] found id: ""
	I1201 19:36:28.568210   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.568217   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:28.568222   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:28.568280   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:28.593437   54581 cri.go:89] found id: ""
	I1201 19:36:28.593450   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.593457   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:28.593463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:28.593557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:28.619497   54581 cri.go:89] found id: ""
	I1201 19:36:28.619511   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.619518   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:28.619523   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:28.619583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:28.647866   54581 cri.go:89] found id: ""
	I1201 19:36:28.647880   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.647887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:28.647893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:28.647950   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:28.673922   54581 cri.go:89] found id: ""
	I1201 19:36:28.673935   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.673943   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:28.673949   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:28.674021   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:28.698912   54581 cri.go:89] found id: ""
	I1201 19:36:28.698926   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.698933   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:28.698941   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:28.698963   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:28.756082   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:28.756100   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:28.767897   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:28.767913   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:28.836301   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:28.836312   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:28.836330   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.907788   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:28.907807   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.438620   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:31.448979   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:31.449042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:31.475188   54581 cri.go:89] found id: ""
	I1201 19:36:31.475202   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.475209   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:31.475215   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:31.475281   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:31.500385   54581 cri.go:89] found id: ""
	I1201 19:36:31.500398   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.500405   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:31.500411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:31.500468   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:31.525394   54581 cri.go:89] found id: ""
	I1201 19:36:31.525407   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.525414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:31.525419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:31.525481   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:31.550792   54581 cri.go:89] found id: ""
	I1201 19:36:31.550808   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.550815   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:31.550821   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:31.550880   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:31.578076   54581 cri.go:89] found id: ""
	I1201 19:36:31.578090   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.578097   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:31.578102   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:31.578159   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:31.604021   54581 cri.go:89] found id: ""
	I1201 19:36:31.604035   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.604042   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:31.604047   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:31.604108   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:31.633105   54581 cri.go:89] found id: ""
	I1201 19:36:31.633119   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.633126   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:31.633134   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:31.633145   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.663524   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:31.663540   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:31.723171   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:31.723189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:31.734100   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:31.734115   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:31.796567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:31.796577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:31.796588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:34.366168   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:34.376457   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:34.376516   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:34.404948   54581 cri.go:89] found id: ""
	I1201 19:36:34.404977   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.404985   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:34.404991   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:34.405063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:34.431692   54581 cri.go:89] found id: ""
	I1201 19:36:34.431706   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.431713   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:34.431718   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:34.431779   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:34.456671   54581 cri.go:89] found id: ""
	I1201 19:36:34.456685   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.456692   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:34.456697   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:34.456755   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:34.481585   54581 cri.go:89] found id: ""
	I1201 19:36:34.481612   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.481620   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:34.481626   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:34.481696   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:34.506818   54581 cri.go:89] found id: ""
	I1201 19:36:34.506832   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.506839   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:34.506845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:34.506906   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:34.535407   54581 cri.go:89] found id: ""
	I1201 19:36:34.535421   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.535428   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:34.535433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:34.535492   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:34.561311   54581 cri.go:89] found id: ""
	I1201 19:36:34.561324   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.561331   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:34.561339   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:34.561350   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:34.592150   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:34.592167   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:34.648352   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:34.648370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:34.659451   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:34.659467   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:34.728942   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:34.728952   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:34.728962   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:37.291213   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:37.301261   54581 kubeadm.go:602] duration metric: took 4m4.008784532s to restartPrimaryControlPlane
	W1201 19:36:37.301323   54581 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1201 19:36:37.301393   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:36:37.706665   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:36:37.720664   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:36:37.728529   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:36:37.728581   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:36:37.736430   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:36:37.736440   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:36:37.736492   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:36:37.744494   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:36:37.744550   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:36:37.752457   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:36:37.760187   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:36:37.760243   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:36:37.768060   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.775900   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:36:37.775969   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.783655   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:36:37.791670   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:36:37.791723   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:36:37.799641   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:36:37.841794   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:36:37.841853   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:36:37.909907   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:36:37.909969   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:36:37.910004   54581 kubeadm.go:319] OS: Linux
	I1201 19:36:37.910048   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:36:37.910095   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:36:37.910141   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:36:37.910188   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:36:37.910235   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:36:37.910281   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:36:37.910325   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:36:37.910372   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:36:37.910417   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:36:37.982652   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:36:37.982760   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:36:37.982849   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:36:37.989962   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:36:37.995459   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:36:37.995557   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:36:37.995632   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:36:37.995718   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:36:37.995796   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:36:37.995875   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:36:37.995938   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:36:37.996008   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:36:37.996076   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:36:37.996160   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:36:37.996243   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:36:37.996290   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:36:37.996352   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:36:38.264574   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:36:38.510797   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:36:39.269570   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:36:39.443703   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:36:40.036623   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:36:40.036725   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:36:40.042253   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:36:40.045573   54581 out.go:252]   - Booting up control plane ...
	I1201 19:36:40.045681   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:36:40.045758   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:36:40.050263   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:36:40.088031   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:36:40.088133   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:36:40.088246   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:36:40.088332   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:36:40.088370   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:36:40.243689   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:36:40.243803   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:40:40.243834   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000165379s
	I1201 19:40:40.243866   54581 kubeadm.go:319] 
	I1201 19:40:40.243923   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:40:40.243956   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:40:40.244085   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:40:40.244090   54581 kubeadm.go:319] 
	I1201 19:40:40.244193   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:40:40.244226   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:40:40.244256   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:40:40.244260   54581 kubeadm.go:319] 
	I1201 19:40:40.248975   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:40:40.249435   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:40:40.249566   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:40:40.249901   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1201 19:40:40.249908   54581 kubeadm.go:319] 
	I1201 19:40:40.249980   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1201 19:40:40.250118   54581 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000165379s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1201 19:40:40.250247   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:40:40.662369   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:40:40.675843   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:40:40.675896   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:40:40.683554   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:40:40.683563   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:40:40.683613   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:40:40.691612   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:40:40.691669   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:40:40.699280   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:40:40.706997   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:40:40.707052   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:40:40.714497   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.722891   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:40:40.722949   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.730907   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:40:40.739761   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:40:40.739818   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:40:40.747474   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:40:40.788983   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:40:40.789292   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:40:40.865634   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:40:40.865697   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:40:40.865734   54581 kubeadm.go:319] OS: Linux
	I1201 19:40:40.865777   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:40:40.865824   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:40:40.865869   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:40:40.865916   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:40:40.865963   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:40:40.866013   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:40:40.866057   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:40:40.866104   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:40:40.866149   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:40:40.935875   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:40:40.935986   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:40:40.936084   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:40:40.941886   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:40:40.947334   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:40:40.947424   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:40:40.947488   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:40:40.947568   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:40:40.947628   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:40:40.947696   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:40:40.947749   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:40:40.947810   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:40:40.947870   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:40:40.947944   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:40:40.948014   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:40:40.948051   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:40:40.948105   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:40:41.580020   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:40:42.099824   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:40:42.537556   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:40:42.996026   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:40:43.565704   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:40:43.566397   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:40:43.569105   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:40:43.572244   54581 out.go:252]   - Booting up control plane ...
	I1201 19:40:43.572342   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:40:43.572765   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:40:43.573983   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:40:43.595015   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:40:43.595116   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:40:43.603073   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:40:43.603347   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:40:43.603559   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:40:43.744445   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:40:43.744558   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:44:43.744318   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000287424s
	I1201 19:44:43.744348   54581 kubeadm.go:319] 
	I1201 19:44:43.744432   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:44:43.744486   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:44:43.744623   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:44:43.744628   54581 kubeadm.go:319] 
	I1201 19:44:43.744749   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:44:43.744781   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:44:43.744822   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:44:43.744831   54581 kubeadm.go:319] 
	I1201 19:44:43.748926   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:44:43.749322   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:44:43.749424   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:44:43.749683   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1201 19:44:43.749689   54581 kubeadm.go:319] 
	I1201 19:44:43.749753   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1201 19:44:43.749803   54581 kubeadm.go:403] duration metric: took 12m10.492478835s to StartCluster
	I1201 19:44:43.749833   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:44:43.749893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:44:43.774966   54581 cri.go:89] found id: ""
	I1201 19:44:43.774979   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.774986   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:44:43.774992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:44:43.775053   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:44:43.800769   54581 cri.go:89] found id: ""
	I1201 19:44:43.800783   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.800790   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:44:43.800796   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:44:43.800854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:44:43.827282   54581 cri.go:89] found id: ""
	I1201 19:44:43.827295   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.827302   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:44:43.827308   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:44:43.827364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:44:43.853930   54581 cri.go:89] found id: ""
	I1201 19:44:43.853944   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.853951   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:44:43.853957   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:44:43.854013   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:44:43.882816   54581 cri.go:89] found id: ""
	I1201 19:44:43.882830   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.882837   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:44:43.882843   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:44:43.882903   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:44:43.909261   54581 cri.go:89] found id: ""
	I1201 19:44:43.909274   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.909281   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:44:43.909287   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:44:43.909344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:44:43.933693   54581 cri.go:89] found id: ""
	I1201 19:44:43.933706   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.933715   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:44:43.933724   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:44:43.933733   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:44:43.990075   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:44:43.990092   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:44:44.001155   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:44:44.001170   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:44:44.070458   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:44:44.070469   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:44:44.070479   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:44:44.136228   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:44:44.136248   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1201 19:44:44.166389   54581 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1201 19:44:44.166422   54581 out.go:285] * 
	W1201 19:44:44.166485   54581 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:44:44.166502   54581 out.go:285] * 
	W1201 19:44:44.168627   54581 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:44:44.175592   54581 out.go:203] 
	W1201 19:44:44.179124   54581 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:44:44.179186   54581 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1201 19:44:44.179207   54581 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1201 19:44:44.182569   54581 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667922542Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667937483Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667949232Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667962787Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668042514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668060319Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668082095Z" level=info msg="runtime interface created"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668088528Z" level=info msg="created NRI interface"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668104946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668151788Z" level=info msg="Connect containerd service"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668662446Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.670384243Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682522727Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682782323Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682944050Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682830321Z" level=info msg="Start recovering state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708040674Z" level=info msg="Start event monitor"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708239258Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708327772Z" level=info msg="Start streaming server"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708412037Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708599093Z" level=info msg="runtime interface starting up..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708668573Z" level=info msg="starting plugins..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708729215Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708948574Z" level=info msg="containerd successfully booted in 0.060821s"
	Dec 01 19:32:31 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:44:47.903278   21795 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:47.904018   21795 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:47.905702   21795 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:47.906263   21795 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:47.907945   21795 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:44:47 up  1:27,  0 user,  load average: 0.05, 0.19, 0.39
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:44:44 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:45 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 01 19:44:45 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:45 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:45 functional-428744 kubelet[21661]: E1201 19:44:45.644227   21661 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:45 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:45 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:46 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 323.
	Dec 01 19:44:46 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:46 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:46 functional-428744 kubelet[21676]: E1201 19:44:46.403870   21676 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:46 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:46 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:47 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 324.
	Dec 01 19:44:47 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:47 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:47 functional-428744 kubelet[21711]: E1201 19:44:47.149314   21711 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:47 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:47 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:44:47 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 325.
	Dec 01 19:44:47 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:47 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:44:47 functional-428744 kubelet[21794]: E1201 19:44:47.892843   21794 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:44:47 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:44:47 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (354.481514ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth (2.25s)
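The recurring failure mode above is kubelet v1.35 refusing to start on a cgroup v1 host (see the `[WARNING SystemVerification]` kubeadm output and the repeated `run.go:72` kubelet errors). As a sketch only: the kubeadm warning names the kubelet configuration option `FailCgroupV1`, so the opt-in override would be a KubeletConfiguration fragment along these lines — the exact field spelling and accepted apiVersion should be verified against the v1.35 kubelet config API before use.

```yaml
# Hypothetical KubeletConfiguration fragment, inferred from the kubeadm
# warning text above ("set 'FailCgroupV1' to 'false'"); verify the field
# name against the v1.35 kubelet.config.k8s.io API before relying on it.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false
```

Per the same warning, the cgroup v1 validation in `kubeadm init` would additionally need to be skipped explicitly; cgroup v1 support remains deprecated either way.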

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-428744 apply -f testdata/invalidsvc.yaml
functional_test.go:2326: (dbg) Non-zero exit: kubectl --context functional-428744 apply -f testdata/invalidsvc.yaml: exit status 1 (67.462727ms)

** stderr ** 
	error: error validating "testdata/invalidsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test.go:2328: kubectl --context functional-428744 apply -f testdata/invalidsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService (0.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd (1.71s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-428744 --alsologtostderr -v=1]
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-428744 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-428744 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-428744 --alsologtostderr -v=1] stderr:
I1201 19:46:57.612687   71931 out.go:360] Setting OutFile to fd 1 ...
I1201 19:46:57.613369   71931 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:46:57.613393   71931 out.go:374] Setting ErrFile to fd 2...
I1201 19:46:57.613410   71931 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:46:57.613744   71931 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:46:57.614030   71931 mustload.go:66] Loading cluster: functional-428744
I1201 19:46:57.614475   71931 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:46:57.614992   71931 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:46:57.645621   71931 host.go:66] Checking if "functional-428744" exists ...
I1201 19:46:57.645928   71931 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1201 19:46:57.706286   71931 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.694909801 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1201 19:46:57.706414   71931 api_server.go:166] Checking apiserver status ...
I1201 19:46:57.706480   71931 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1201 19:46:57.706524   71931 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:46:57.724862   71931 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
W1201 19:46:57.831134   71931 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1201 19:46:57.834305   71931 out.go:179] * The control-plane node functional-428744 apiserver is not running: (state=Stopped)
I1201 19:46:57.837195   71931 out.go:179]   To start a cluster, run: "minikube start -p functional-428744"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (314.084243ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service   │ functional-428744 service hello-node --url --format={{.IP}}                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ service   │ functional-428744 service hello-node --url                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001:/mount-9p --alsologtostderr -v=1              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh -- ls -la /mount-9p                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh cat /mount-9p/test-1764618408110613921                                                                                        │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh sudo umount -f /mount-9p                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo894755151/001:/mount-9p --alsologtostderr -v=1 --port 46464  │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh -- ls -la /mount-9p                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh sudo umount -f /mount-9p                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount1 --alsologtostderr -v=1                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount2 --alsologtostderr -v=1                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount1                                                                                                            │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount3 --alsologtostderr -v=1                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount2                                                                                                            │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh findmnt -T /mount3                                                                                                            │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount     │ -p functional-428744 --kill=true                                                                                                                    │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ start     │ -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ start     │ -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ start     │ -p functional-428744 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-428744 --alsologtostderr -v=1                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:46:57
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:46:57.356677   71859 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:46:57.356876   71859 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.356902   71859 out.go:374] Setting ErrFile to fd 2...
	I1201 19:46:57.356926   71859 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.357228   71859 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:46:57.357689   71859 out.go:368] Setting JSON to false
	I1201 19:46:57.358532   71859 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":5369,"bootTime":1764613049,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:46:57.358629   71859 start.go:143] virtualization:  
	I1201 19:46:57.361983   71859 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:46:57.365564   71859 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:46:57.365715   71859 notify.go:221] Checking for updates...
	I1201 19:46:57.371851   71859 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:46:57.374888   71859 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:46:57.377862   71859 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:46:57.380678   71859 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:46:57.383580   71859 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:46:57.386969   71859 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:46:57.387571   71859 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:46:57.417643   71859 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:46:57.417826   71859 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.480404   71859 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.471331211 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.480521   71859 docker.go:319] overlay module found
	I1201 19:46:57.483650   71859 out.go:179] * Using the docker driver based on existing profile
	I1201 19:46:57.486552   71859 start.go:309] selected driver: docker
	I1201 19:46:57.486579   71859 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p Mou
ntUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.486673   71859 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:46:57.486786   71859 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.541825   71859 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.532967583 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.542260   71859 cni.go:84] Creating CNI manager for ""
	I1201 19:46:57.542331   71859 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:46:57.542373   71859 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disab
leCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.547200   71859 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667922542Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667937483Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667949232Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667962787Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668042514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668060319Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668082095Z" level=info msg="runtime interface created"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668088528Z" level=info msg="created NRI interface"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668104946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668151788Z" level=info msg="Connect containerd service"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668662446Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.670384243Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682522727Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682782323Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682944050Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682830321Z" level=info msg="Start recovering state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708040674Z" level=info msg="Start event monitor"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708239258Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708327772Z" level=info msg="Start streaming server"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708412037Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708599093Z" level=info msg="runtime interface starting up..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708668573Z" level=info msg="starting plugins..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708729215Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708948574Z" level=info msg="containerd successfully booted in 0.060821s"
	Dec 01 19:32:31 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:46:58.874504   23923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:58.874914   23923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:58.876459   23923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:58.876819   23923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:58.878292   23923 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:46:58 up  1:29,  0 user,  load average: 1.17, 0.43, 0.43
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:46:55 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:56 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 496.
	Dec 01 19:46:56 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:56 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:56 functional-428744 kubelet[23783]: E1201 19:46:56.168171   23783 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:56 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:56 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:56 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 497.
	Dec 01 19:46:56 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:56 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:56 functional-428744 kubelet[23802]: E1201 19:46:56.876870   23802 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:56 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:56 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:57 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 498.
	Dec 01 19:46:57 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:57 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:57 functional-428744 kubelet[23810]: E1201 19:46:57.655737   23810 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:57 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:57 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:58 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 499.
	Dec 01 19:46:58 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:58 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:58 functional-428744 kubelet[23840]: E1201 19:46:58.394928   23840 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:58 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:58 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
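The kubelet log above shows the root cause of this failure cluster: kubelet v1.35.0-beta.0 refuses to start because the host uses cgroup v1 ("kubelet is configured to not run on a host using cgroup v1"). As a diagnostic aside (not part of the test output), the cgroup version of a host can be checked from the filesystem type mounted at /sys/fs/cgroup; the mapping below is an assumption based on the standard sysfs layout (cgroup2fs for v2, tmpfs for the v1 hierarchy), and `classify_cgroup` is a hypothetical helper name:

```shell
# classify_cgroup: map the filesystem type reported for /sys/fs/cgroup
# to a cgroup version label (assumption: cgroup2fs => v2, tmpfs => v1).
classify_cgroup() {
  case "$1" in
    cgroup2fs) echo v2 ;;
    tmpfs)     echo v1 ;;
    *)         echo unknown ;;
  esac
}

# On a live host: `stat -fc %T` prints the filesystem type of the mount.
classify_cgroup "$(stat -fc %T /sys/fs/cgroup/ 2>/dev/null)"
```

On the Ubuntu 20.04 runner in this report, this would be expected to print `v1`, consistent with the kubelet validation error in the log.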
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (301.694406ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd (1.71s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd (3.19s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 status
functional_test.go:869: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 status: exit status 2 (364.774442ms)

-- stdout --
	functional-428744
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	

-- /stdout --
functional_test.go:871: failed to run minikube status. args "out/minikube-linux-arm64 -p functional-428744 status" : exit status 2
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:875: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}: exit status 2 (314.288312ms)

-- stdout --
	host:Running,kublet:Stopped,apiserver:Stopped,kubeconfig:Configured

-- /stdout --
functional_test.go:877: failed to run minikube status with custom format: args "out/minikube-linux-arm64 -p functional-428744 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}": exit status 2
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 status -o json
functional_test.go:887: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 status -o json: exit status 2 (325.494245ms)

-- stdout --
	{"Name":"functional-428744","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
functional_test.go:889: failed to run minikube status with json output. args "out/minikube-linux-arm64 -p functional-428744 status -o json" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (331.348349ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                        ARGS                                                                        │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ addons  │ functional-428744 addons list -o json                                                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ service │ functional-428744 service list                                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ service │ functional-428744 service list -o json                                                                                                             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ service │ functional-428744 service --namespace=default --https --url hello-node                                                                             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ service │ functional-428744 service hello-node --url --format={{.IP}}                                                                                        │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ service │ functional-428744 service hello-node --url                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh     │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount   │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001:/mount-9p --alsologtostderr -v=1             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh     │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh     │ functional-428744 ssh -- ls -la /mount-9p                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh     │ functional-428744 ssh cat /mount-9p/test-1764618408110613921                                                                                       │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh     │ functional-428744 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh     │ functional-428744 ssh sudo umount -f /mount-9p                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount   │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo894755151/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh     │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh     │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh     │ functional-428744 ssh -- ls -la /mount-9p                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh     │ functional-428744 ssh sudo umount -f /mount-9p                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount   │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount1 --alsologtostderr -v=1               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount   │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount2 --alsologtostderr -v=1               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh     │ functional-428744 ssh findmnt -T /mount1                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount   │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount3 --alsologtostderr -v=1               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh     │ functional-428744 ssh findmnt -T /mount2                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh     │ functional-428744 ssh findmnt -T /mount3                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount   │ -p functional-428744 --kill=true                                                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	└─────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:32:28
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:32:28.671063   54581 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:32:28.671177   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671181   54581 out.go:374] Setting ErrFile to fd 2...
	I1201 19:32:28.671185   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671462   54581 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:32:28.671791   54581 out.go:368] Setting JSON to false
	I1201 19:32:28.672593   54581 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4500,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:32:28.672645   54581 start.go:143] virtualization:  
	I1201 19:32:28.676118   54581 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:32:28.679062   54581 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:32:28.679153   54581 notify.go:221] Checking for updates...
	I1201 19:32:28.685968   54581 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:32:28.688852   54581 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:32:28.691733   54581 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:32:28.694613   54581 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:32:28.697549   54581 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:32:28.700837   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:28.700934   54581 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:32:28.730800   54581 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:32:28.730894   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.786972   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.776963779 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.787065   54581 docker.go:319] overlay module found
	I1201 19:32:28.789990   54581 out.go:179] * Using the docker driver based on existing profile
	I1201 19:32:28.792702   54581 start.go:309] selected driver: docker
	I1201 19:32:28.792712   54581 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.792814   54581 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:32:28.792926   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.854079   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.841219008 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.854498   54581 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1201 19:32:28.854520   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:28.854580   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:28.854619   54581 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.858061   54581 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:32:28.860972   54581 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:32:28.863997   54581 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:32:28.866788   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:28.866980   54581 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:32:28.895611   54581 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:32:28.895623   54581 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:32:28.922565   54581 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:32:29.117617   54581 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:32:29.117759   54581 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:32:29.117789   54581 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117872   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:32:29.117882   54581 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 108.863µs
	I1201 19:32:29.117888   54581 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:32:29.117898   54581 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117926   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:32:29.117930   54581 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 33.443µs
	I1201 19:32:29.117935   54581 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117944   54581 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117979   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:32:29.117983   54581 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.647µs
	I1201 19:32:29.117988   54581 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117998   54581 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118023   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:32:29.118035   54581 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 30.974µs
	I1201 19:32:29.118040   54581 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118040   54581 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:32:29.118048   54581 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118072   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:32:29.118066   54581 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118075   54581 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 28.709µs
	I1201 19:32:29.118080   54581 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118088   54581 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118102   54581 start.go:364] duration metric: took 25.197µs to acquireMachinesLock for "functional-428744"
	I1201 19:32:29.118113   54581 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:32:29.118114   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:32:29.118117   54581 fix.go:54] fixHost starting: 
	I1201 19:32:29.118118   54581 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.457µs
	I1201 19:32:29.118122   54581 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:32:29.118129   54581 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118152   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:32:29.118156   54581 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 27.199µs
	I1201 19:32:29.118160   54581 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:32:29.118167   54581 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118216   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:32:29.118220   54581 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 53.562µs
	I1201 19:32:29.118229   54581 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:32:29.118236   54581 cache.go:87] Successfully saved all images to host disk.
	I1201 19:32:29.118392   54581 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:32:29.135509   54581 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:32:29.135543   54581 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:32:29.140504   54581 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:32:29.140530   54581 machine.go:94] provisionDockerMachine start ...
	I1201 19:32:29.140609   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.157677   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.157997   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.158004   54581 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:32:29.305012   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.305026   54581 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:32:29.305098   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.323134   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.323429   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.323437   54581 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:32:29.478458   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.478532   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.497049   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.498161   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.498184   54581 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
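The /etc/hosts branch logic above can be replayed against a throwaway file instead of the real /etc/hosts, so it runs without sudo. This is a simplified sketch of the same update; `HOSTS` and `NAME` are illustrative stand-ins for /etc/hosts and the node hostname:

```shell
# Replay of minikube's /etc/hosts hostname update against a temp copy.
HOSTS=$(mktemp)
NAME=functional-428744
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$HOSTS"

if ! grep -q "[[:space:]]$NAME" "$HOSTS"; then
  if grep -q '^127.0.1.1[[:space:]]' "$HOSTS"; then
    # An existing 127.0.1.1 entry is rewritten in place...
    sed -i "s/^127.0.1.1[[:space:]].*/127.0.1.1 $NAME/" "$HOSTS"
  else
    # ...otherwise a fresh entry is appended.
    echo "127.0.1.1 $NAME" >> "$HOSTS"
  fi
fi
cat "$HOSTS"
```

Since the sample file already has a `127.0.1.1` line, the rewrite branch is taken and `old-name` is replaced.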
	I1201 19:32:29.645663   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:32:29.645679   54581 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:32:29.645696   54581 ubuntu.go:190] setting up certificates
	I1201 19:32:29.645703   54581 provision.go:84] configureAuth start
	I1201 19:32:29.645772   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:29.663161   54581 provision.go:143] copyHostCerts
	I1201 19:32:29.663227   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:32:29.663233   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:32:29.663306   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:32:29.663413   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:32:29.663416   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:32:29.663441   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:32:29.663488   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:32:29.663496   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:32:29.663517   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:32:29.663560   54581 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:32:29.922590   54581 provision.go:177] copyRemoteCerts
	I1201 19:32:29.922645   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:32:29.922682   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.944750   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.066257   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:32:30.114189   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:32:30.139869   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:32:30.162018   54581 provision.go:87] duration metric: took 516.289617ms to configureAuth
	I1201 19:32:30.162044   54581 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:32:30.162294   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:30.162301   54581 machine.go:97] duration metric: took 1.021765793s to provisionDockerMachine
	I1201 19:32:30.162308   54581 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:32:30.162319   54581 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:32:30.162368   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:32:30.162422   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.181979   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.285977   54581 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:32:30.289531   54581 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:32:30.289549   54581 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:32:30.289559   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:32:30.289616   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:32:30.289694   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:32:30.289767   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:32:30.289821   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:32:30.297763   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:30.315893   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:32:30.335096   54581 start.go:296] duration metric: took 172.774471ms for postStartSetup
	I1201 19:32:30.335168   54581 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:32:30.335214   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.355398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.458545   54581 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:32:30.463103   54581 fix.go:56] duration metric: took 1.344978374s for fixHost
	I1201 19:32:30.463118   54581 start.go:83] releasing machines lock for "functional-428744", held for 1.345010357s
	I1201 19:32:30.463185   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:30.480039   54581 ssh_runner.go:195] Run: cat /version.json
	I1201 19:32:30.480081   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.480337   54581 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:32:30.480395   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.499221   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.501398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.601341   54581 ssh_runner.go:195] Run: systemctl --version
	I1201 19:32:30.695138   54581 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1201 19:32:30.699523   54581 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:32:30.699612   54581 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:32:30.707379   54581 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:32:30.707392   54581 start.go:496] detecting cgroup driver to use...
	I1201 19:32:30.707423   54581 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:32:30.707469   54581 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:32:30.722782   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:32:30.736023   54581 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:32:30.736084   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:32:30.751857   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:32:30.765106   54581 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:32:30.881005   54581 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:32:31.019194   54581 docker.go:234] disabling docker service ...
	I1201 19:32:31.019259   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:32:31.037044   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:32:31.052926   54581 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:32:31.181456   54581 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:32:31.340481   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:32:31.355001   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
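The crictl.yaml write above pins crictl to the containerd CRI socket so later `crictl` invocations in the log do not need an `--runtime-endpoint` flag. A sketch of the same file generation against a temp path (the real target, /etc/crictl.yaml, requires sudo; `CRICTL_CFG` is an illustrative stand-in):

```shell
# Generate the crictl config minikube writes, into a temp file for inspection.
CRICTL_CFG=$(mktemp)
printf '%s\n' 'runtime-endpoint: unix:///run/containerd/containerd.sock' > "$CRICTL_CFG"
cat "$CRICTL_CFG"
```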
	I1201 19:32:31.370840   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:32:31.380231   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:32:31.389693   54581 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:32:31.389764   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:32:31.399360   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.408437   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:32:31.417370   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.426455   54581 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:32:31.434636   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:32:31.443735   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:32:31.453324   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
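The sed series above rewrites containerd's config.toml in place before the restart. Two of those edits, the sandbox-image pin and the `SystemdCgroup = false` setting that matches the detected "cgroupfs" host driver, can be exercised on a minimal sample file; `CFG` is an illustrative stand-in for /etc/containerd/config.toml, and the sample contents are made up for the demonstration:

```shell
# Apply two of minikube's config.toml rewrites to a minimal sample config.
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.9"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
EOF
# Pin the pause image expected by this Kubernetes version, preserving indent.
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' "$CFG"
# Force the runc runtime to the cgroupfs driver.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$CFG"
cat "$CFG"
```

The `( *)` capture keeps each key at its original indentation level, which matters in the nested TOML sections.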
	I1201 19:32:31.462516   54581 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:32:31.470270   54581 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:32:31.478172   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:31.592137   54581 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:32:31.712107   54581 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:32:31.712186   54581 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:32:31.715994   54581 start.go:564] Will wait 60s for crictl version
	I1201 19:32:31.716056   54581 ssh_runner.go:195] Run: which crictl
	I1201 19:32:31.719610   54581 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:32:31.745073   54581 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:32:31.745152   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.765358   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.791628   54581 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:32:31.794721   54581 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:32:31.811133   54581 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:32:31.818179   54581 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1201 19:32:31.821064   54581 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker Bina
ryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:32:31.821193   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:31.821269   54581 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:32:31.856356   54581 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:32:31.856368   54581 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:32:31.856374   54581 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:32:31.856475   54581 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:32:31.856536   54581 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:32:31.895308   54581 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1201 19:32:31.895325   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:31.895333   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:31.895346   54581 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:32:31.895366   54581 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false Kubel
etConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:32:31.895478   54581 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 19:32:31.895541   54581 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:32:31.905339   54581 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:32:31.905406   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:32:31.913323   54581 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:32:31.927846   54581 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:32:31.940396   54581 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1201 19:32:31.953139   54581 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:32:31.956806   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:32.073166   54581 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:32:32.587407   54581 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:32:32.587419   54581 certs.go:195] generating shared ca certs ...
	I1201 19:32:32.587436   54581 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:32:32.587628   54581 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:32:32.587672   54581 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:32:32.587679   54581 certs.go:257] generating profile certs ...
	I1201 19:32:32.587796   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:32:32.587858   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:32:32.587895   54581 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:32:32.588027   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:32:32.588060   54581 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:32:32.588067   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:32:32.588104   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:32:32.588128   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:32:32.588158   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:32:32.588202   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:32.589935   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:32:32.611510   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:32:32.631449   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:32:32.652864   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:32:32.672439   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:32:32.690857   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:32:32.709160   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:32:32.727076   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:32:32.745055   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:32:32.762625   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:32:32.780355   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:32:32.797626   54581 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:32:32.810250   54581 ssh_runner.go:195] Run: openssl version
	I1201 19:32:32.816425   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:32:32.825294   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829094   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829148   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.869893   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:32:32.877720   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:32:32.886198   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889911   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889967   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.930479   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:32:32.938463   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:32:32.946940   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950621   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950676   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.991499   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:32:32.999452   54581 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:32:33.003313   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:32:33.045305   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:32:33.087269   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:32:33.128376   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:32:33.169796   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:32:33.211259   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:32:33.257335   54581 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:33.257412   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:32:33.257501   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.284260   54581 cri.go:89] found id: ""
	I1201 19:32:33.284320   54581 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:32:33.292458   54581 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:32:33.292468   54581 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:32:33.292518   54581 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:32:33.300158   54581 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.300668   54581 kubeconfig.go:125] found "functional-428744" server: "https://192.168.49.2:8441"
	I1201 19:32:33.301960   54581 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:32:33.310120   54581 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-01 19:17:59.066738599 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-01 19:32:31.946987775 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1201 19:32:33.310138   54581 kubeadm.go:1161] stopping kube-system containers ...
	I1201 19:32:33.310149   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1201 19:32:33.310213   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.338492   54581 cri.go:89] found id: ""
	I1201 19:32:33.338551   54581 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1201 19:32:33.356342   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:32:33.364607   54581 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec  1 19:22 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec  1 19:22 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5676 Dec  1 19:22 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5588 Dec  1 19:22 /etc/kubernetes/scheduler.conf
	
	I1201 19:32:33.364669   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:32:33.372608   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:32:33.380647   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.380700   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:32:33.388464   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.397123   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.397189   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.404816   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:32:33.412562   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.412628   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:32:33.420390   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:32:33.428330   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:33.477124   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.484075   54581 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.006926734s)
	I1201 19:32:34.484135   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.694382   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.769616   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.812433   54581 api_server.go:52] waiting for apiserver process to appear ...
	I1201 19:32:34.812505   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.313033   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.812993   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.312704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.813245   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.313300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.812687   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.312636   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.813205   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.813572   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.312587   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.812696   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.313535   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.813472   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.813224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.313067   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.813328   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.312678   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.813484   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.312731   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.812683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.313429   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.813026   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.312606   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.812689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.813689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.313474   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.312618   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.813410   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.313371   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.812979   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.312792   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.812691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.313042   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.813445   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.313212   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.812741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.312722   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.812580   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.313621   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.813459   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.313224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.812880   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.313609   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.313283   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.812739   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.313558   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.813248   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.313098   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.813623   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.313600   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.813357   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.312559   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.812827   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.312653   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.812616   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.313447   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.813117   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.312712   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.812713   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.314198   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.313642   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.813457   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.313464   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.812697   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.312626   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.813299   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.813267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.312931   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.812887   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.312894   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.813197   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.312689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.812595   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.313557   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.313428   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.813327   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.313520   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.812744   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.313564   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.812611   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.313634   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.813393   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.313426   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.812688   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.313372   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.812638   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.313360   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.812897   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.313015   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.813101   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.312709   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.812907   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.312644   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.812569   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.313009   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.813448   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.312851   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.813268   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.313602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.312692   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.813538   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.313307   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.813008   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.313397   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.313454   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.813423   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.313344   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.813145   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:34.312690   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:34.813369   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:34.813443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:34.847624   54581 cri.go:89] found id: ""
	I1201 19:33:34.847638   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.847645   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:34.847650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:34.847707   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:34.877781   54581 cri.go:89] found id: ""
	I1201 19:33:34.877795   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.877802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:34.877807   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:34.877865   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:34.906556   54581 cri.go:89] found id: ""
	I1201 19:33:34.906569   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.906575   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:34.906581   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:34.906638   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:34.932243   54581 cri.go:89] found id: ""
	I1201 19:33:34.932257   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.932264   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:34.932275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:34.932334   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:34.958307   54581 cri.go:89] found id: ""
	I1201 19:33:34.958320   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.958327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:34.958333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:34.958393   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:34.987839   54581 cri.go:89] found id: ""
	I1201 19:33:34.987852   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.987860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:34.987865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:34.987924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:35.013339   54581 cri.go:89] found id: ""
	I1201 19:33:35.013353   54581 logs.go:282] 0 containers: []
	W1201 19:33:35.013360   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:35.013367   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:35.013377   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:35.024284   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:35.024300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:35.102562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:35.102584   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:35.102595   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:35.168823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:35.168843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:35.200459   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:35.200475   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:37.759267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:37.769446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:37.769528   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:37.794441   54581 cri.go:89] found id: ""
	I1201 19:33:37.794454   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.794461   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:37.794467   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:37.794522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:37.825029   54581 cri.go:89] found id: ""
	I1201 19:33:37.825042   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.825049   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:37.825059   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:37.825116   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:37.855847   54581 cri.go:89] found id: ""
	I1201 19:33:37.855860   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.855867   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:37.855872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:37.855932   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:37.892812   54581 cri.go:89] found id: ""
	I1201 19:33:37.892826   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.892833   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:37.892839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:37.892902   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:37.923175   54581 cri.go:89] found id: ""
	I1201 19:33:37.923189   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.923195   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:37.923201   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:37.923260   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:37.956838   54581 cri.go:89] found id: ""
	I1201 19:33:37.956852   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.956858   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:37.956864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:37.956921   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:37.983288   54581 cri.go:89] found id: ""
	I1201 19:33:37.983302   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.983309   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:37.983317   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:37.983328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:38.048803   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:38.048828   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:38.048842   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:38.114525   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:38.114549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:38.144040   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:38.144056   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:38.203160   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:38.203178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:40.714632   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:40.724993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:40.725058   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:40.749954   54581 cri.go:89] found id: ""
	I1201 19:33:40.749968   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.749975   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:40.749981   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:40.750040   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:40.775337   54581 cri.go:89] found id: ""
	I1201 19:33:40.775350   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.775357   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:40.775362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:40.775425   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:40.801568   54581 cri.go:89] found id: ""
	I1201 19:33:40.801582   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.801590   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:40.801595   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:40.801663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:40.829766   54581 cri.go:89] found id: ""
	I1201 19:33:40.829779   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.829786   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:40.829791   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:40.829850   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:40.864362   54581 cri.go:89] found id: ""
	I1201 19:33:40.864376   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.864383   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:40.864389   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:40.864447   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:40.893407   54581 cri.go:89] found id: ""
	I1201 19:33:40.893419   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.893427   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:40.893433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:40.893507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:40.919149   54581 cri.go:89] found id: ""
	I1201 19:33:40.919163   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.919172   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:40.919179   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:40.919189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:40.949474   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:40.949572   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:41.005421   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:41.005440   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:41.016259   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:41.016274   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:41.078378   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:41.078391   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:41.078401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:43.641960   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:43.652106   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:43.652178   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:43.682005   54581 cri.go:89] found id: ""
	I1201 19:33:43.682018   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.682025   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:43.682030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:43.682087   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:43.707580   54581 cri.go:89] found id: ""
	I1201 19:33:43.707593   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.707600   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:43.707606   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:43.707711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:43.732400   54581 cri.go:89] found id: ""
	I1201 19:33:43.732414   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.732421   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:43.732426   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:43.732483   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:43.758218   54581 cri.go:89] found id: ""
	I1201 19:33:43.758232   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.758239   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:43.758245   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:43.758303   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:43.783139   54581 cri.go:89] found id: ""
	I1201 19:33:43.783152   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.783159   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:43.783164   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:43.783227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:43.813453   54581 cri.go:89] found id: ""
	I1201 19:33:43.813467   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.813474   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:43.813480   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:43.813548   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:43.845612   54581 cri.go:89] found id: ""
	I1201 19:33:43.845625   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.845632   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:43.845639   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:43.845649   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:43.909426   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:43.909445   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:43.920543   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:43.920560   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:43.988764   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:43.988776   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:43.988797   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:44.051182   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:44.051208   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:46.583925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:46.594468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:46.594554   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:46.620265   54581 cri.go:89] found id: ""
	I1201 19:33:46.620279   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.620286   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:46.620292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:46.620351   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:46.644633   54581 cri.go:89] found id: ""
	I1201 19:33:46.644652   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.644659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:46.644665   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:46.644721   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:46.669867   54581 cri.go:89] found id: ""
	I1201 19:33:46.669881   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.669888   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:46.669893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:46.669948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:46.694417   54581 cri.go:89] found id: ""
	I1201 19:33:46.694431   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.694438   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:46.694454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:46.694512   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:46.721029   54581 cri.go:89] found id: ""
	I1201 19:33:46.721043   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.721051   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:46.721056   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:46.721114   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:46.747445   54581 cri.go:89] found id: ""
	I1201 19:33:46.747459   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.747466   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:46.747471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:46.747525   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:46.771251   54581 cri.go:89] found id: ""
	I1201 19:33:46.771266   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.771272   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:46.771281   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:46.771290   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:46.829699   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:46.829716   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:46.842077   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:46.842096   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:46.924213   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:46.924225   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:46.924235   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:46.990853   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:46.990872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:49.521683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:49.531974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:49.532042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:49.557473   54581 cri.go:89] found id: ""
	I1201 19:33:49.557514   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.557521   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:49.557527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:49.557640   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:49.583182   54581 cri.go:89] found id: ""
	I1201 19:33:49.583229   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.583237   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:49.583242   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:49.583308   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:49.611533   54581 cri.go:89] found id: ""
	I1201 19:33:49.611546   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.611553   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:49.611559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:49.611615   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:49.637433   54581 cri.go:89] found id: ""
	I1201 19:33:49.637446   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.637460   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:49.637466   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:49.637558   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:49.667274   54581 cri.go:89] found id: ""
	I1201 19:33:49.667287   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.667294   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:49.667299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:49.667358   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:49.696772   54581 cri.go:89] found id: ""
	I1201 19:33:49.696790   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.696797   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:49.696803   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:49.696861   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:49.720607   54581 cri.go:89] found id: ""
	I1201 19:33:49.720621   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.720637   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:49.720645   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:49.720655   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:49.776412   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:49.776431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:49.787417   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:49.787432   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:49.862636   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:49.862647   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:49.862658   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:49.934395   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:49.934421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:52.463339   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:52.473586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:52.473650   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:52.503529   54581 cri.go:89] found id: ""
	I1201 19:33:52.503542   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.503549   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:52.503555   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:52.503618   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:52.531144   54581 cri.go:89] found id: ""
	I1201 19:33:52.531158   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.531165   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:52.531170   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:52.531228   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:52.556664   54581 cri.go:89] found id: ""
	I1201 19:33:52.556678   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.556685   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:52.556691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:52.556753   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:52.583782   54581 cri.go:89] found id: ""
	I1201 19:33:52.583796   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.583802   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:52.583808   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:52.583866   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:52.608468   54581 cri.go:89] found id: ""
	I1201 19:33:52.608481   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.608488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:52.608494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:52.608553   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:52.632068   54581 cri.go:89] found id: ""
	I1201 19:33:52.632081   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.632088   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:52.632093   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:52.632153   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:52.656905   54581 cri.go:89] found id: ""
	I1201 19:33:52.656919   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.656926   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:52.656934   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:52.656944   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:52.715322   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:52.715340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:52.725941   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:52.725956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:52.787814   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:52.787824   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:52.787835   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:52.857124   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:52.857146   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:55.384601   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:55.394657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:55.394724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:55.419003   54581 cri.go:89] found id: ""
	I1201 19:33:55.419016   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.419023   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:55.419028   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:55.419093   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:55.444043   54581 cri.go:89] found id: ""
	I1201 19:33:55.444057   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.444064   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:55.444069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:55.444126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:55.469199   54581 cri.go:89] found id: ""
	I1201 19:33:55.469212   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.469219   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:55.469224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:55.469284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:55.494106   54581 cri.go:89] found id: ""
	I1201 19:33:55.494123   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.494130   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:55.494135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:55.494192   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:55.523658   54581 cri.go:89] found id: ""
	I1201 19:33:55.523671   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.523678   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:55.523683   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:55.523742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:55.549084   54581 cri.go:89] found id: ""
	I1201 19:33:55.549097   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.549105   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:55.549110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:55.549171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:55.573973   54581 cri.go:89] found id: ""
	I1201 19:33:55.573986   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.573993   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:55.574001   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:55.574014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:55.629601   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:55.629618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:55.640511   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:55.640527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:55.703852   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:55.703862   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:55.703875   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:55.767135   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:55.767154   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.297608   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:58.307660   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:58.307729   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:58.331936   54581 cri.go:89] found id: ""
	I1201 19:33:58.331948   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.331955   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:58.331961   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:58.332023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:58.356515   54581 cri.go:89] found id: ""
	I1201 19:33:58.356528   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.356535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:58.356544   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:58.356601   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:58.381178   54581 cri.go:89] found id: ""
	I1201 19:33:58.381191   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.381198   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:58.381203   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:58.381259   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:58.405890   54581 cri.go:89] found id: ""
	I1201 19:33:58.405904   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.405911   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:58.405916   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:58.405971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:58.429783   54581 cri.go:89] found id: ""
	I1201 19:33:58.429796   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.429804   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:58.429809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:58.429875   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:58.454357   54581 cri.go:89] found id: ""
	I1201 19:33:58.454370   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.454377   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:58.454383   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:58.454443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:58.483382   54581 cri.go:89] found id: ""
	I1201 19:33:58.483395   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.483403   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:58.483410   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:58.483421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:58.494465   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:58.494480   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:58.557097   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:58.557108   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:58.557119   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:58.624200   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:58.624219   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.654678   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:58.654694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.213704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:01.225298   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:01.225360   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:01.251173   54581 cri.go:89] found id: ""
	I1201 19:34:01.251187   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.251194   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:01.251200   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:01.251272   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:01.278884   54581 cri.go:89] found id: ""
	I1201 19:34:01.278897   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.278904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:01.278910   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:01.278967   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:01.305393   54581 cri.go:89] found id: ""
	I1201 19:34:01.305407   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.305414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:01.305419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:01.305522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:01.331958   54581 cri.go:89] found id: ""
	I1201 19:34:01.331971   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.331978   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:01.331983   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:01.332042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:01.357701   54581 cri.go:89] found id: ""
	I1201 19:34:01.357714   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.357721   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:01.357727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:01.357786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:01.384631   54581 cri.go:89] found id: ""
	I1201 19:34:01.384645   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.384662   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:01.384668   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:01.384742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:01.410554   54581 cri.go:89] found id: ""
	I1201 19:34:01.410567   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.410574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:01.410582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:01.410591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.466596   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:01.466614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:01.477827   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:01.477843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:01.543509   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:01.543518   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:01.543529   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:01.606587   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:01.606608   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
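The log above repeats the same health probe every ~3 s: for each control-plane component, run `crictl ps -a --quiet --name=<component>` and report when no container matches, then gather kubelet/containerd/dmesg logs and a `kubectl describe nodes` that fails with connection refused on port 8441 (the apiserver never came up). A minimal sketch of the container-probe half, meant to run inside the node (e.g. via `minikube ssh`); the component names mirror the log, and the `probe` helper name is our own:

```shell
# Sketch of the per-component probe minikube repeats in the log above.
# Assumes it runs inside the node (e.g. `minikube ssh`) with crictl present.
probe() {
  for name in kube-apiserver etcd coredns kube-scheduler kube-proxy \
              kube-controller-manager kindnet; do
    # `crictl ps -a --quiet` prints one container ID per line, or nothing.
    ids=$(sudo crictl ps -a --quiet --name="$name" 2>/dev/null)
    [ -z "$ids" ] && echo "No container was found matching \"$name\""
  done
  return 0
}
```

When every component prints the "No container was found" line, as here, the follow-up log gathers (`journalctl -u kubelet`, `journalctl -u containerd`, `dmesg`) are the place to look for why the kubelet never started the static pods.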
	I1201 19:34:04.136300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:04.146336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:04.146412   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:04.177880   54581 cri.go:89] found id: ""
	I1201 19:34:04.177894   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.177901   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:04.177906   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:04.177971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:04.203986   54581 cri.go:89] found id: ""
	I1201 19:34:04.203999   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.204006   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:04.204012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:04.204068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:04.228899   54581 cri.go:89] found id: ""
	I1201 19:34:04.228912   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.228920   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:04.228925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:04.228989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:04.254700   54581 cri.go:89] found id: ""
	I1201 19:34:04.254715   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.254722   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:04.254729   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:04.254788   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:04.280370   54581 cri.go:89] found id: ""
	I1201 19:34:04.280383   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.280390   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:04.280396   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:04.280453   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:04.304821   54581 cri.go:89] found id: ""
	I1201 19:34:04.304834   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.304842   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:04.304847   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:04.304910   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:04.331513   54581 cri.go:89] found id: ""
	I1201 19:34:04.331525   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.331533   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:04.331540   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:04.331550   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:04.390353   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:04.390371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:04.403182   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:04.403198   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:04.471239   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:04.471261   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:04.471273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:04.534546   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:04.534567   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:07.063925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:07.074362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:07.074427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:07.107919   54581 cri.go:89] found id: ""
	I1201 19:34:07.107933   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.107940   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:07.107946   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:07.108003   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:07.137952   54581 cri.go:89] found id: ""
	I1201 19:34:07.137965   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.137973   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:07.137978   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:07.138038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:07.172024   54581 cri.go:89] found id: ""
	I1201 19:34:07.172037   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.172044   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:07.172049   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:07.172107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:07.196732   54581 cri.go:89] found id: ""
	I1201 19:34:07.196745   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.196752   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:07.196759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:07.196814   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:07.221862   54581 cri.go:89] found id: ""
	I1201 19:34:07.221875   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.221882   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:07.221888   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:07.221947   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:07.249751   54581 cri.go:89] found id: ""
	I1201 19:34:07.249765   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.249771   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:07.249777   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:07.249833   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:07.275027   54581 cri.go:89] found id: ""
	I1201 19:34:07.275040   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.275047   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:07.275055   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:07.275065   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:07.330139   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:07.330156   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:07.341431   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:07.341447   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:07.404752   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:07.404762   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:07.404780   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:07.471227   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:07.471244   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.003255   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:10.013892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:10.013949   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:10.059011   54581 cri.go:89] found id: ""
	I1201 19:34:10.059025   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.059033   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:10.059039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:10.059101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:10.096138   54581 cri.go:89] found id: ""
	I1201 19:34:10.096152   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.096170   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:10.096177   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:10.096282   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:10.138539   54581 cri.go:89] found id: ""
	I1201 19:34:10.138600   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.138612   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:10.138618   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:10.138688   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:10.168476   54581 cri.go:89] found id: ""
	I1201 19:34:10.168490   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.168497   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:10.168502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:10.168580   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:10.194454   54581 cri.go:89] found id: ""
	I1201 19:34:10.194480   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.194487   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:10.194493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:10.194560   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:10.219419   54581 cri.go:89] found id: ""
	I1201 19:34:10.219432   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.219439   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:10.219445   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:10.219507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:10.244925   54581 cri.go:89] found id: ""
	I1201 19:34:10.244938   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.244945   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:10.244953   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:10.244964   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:10.311653   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:10.311663   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:10.311673   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:10.377857   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:10.377877   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.407833   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:10.407851   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:10.467737   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:10.467757   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:12.980376   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:12.990779   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:12.990838   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:13.016106   54581 cri.go:89] found id: ""
	I1201 19:34:13.016120   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.016127   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:13.016133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:13.016198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:13.044361   54581 cri.go:89] found id: ""
	I1201 19:34:13.044375   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.044382   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:13.044387   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:13.044444   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:13.069827   54581 cri.go:89] found id: ""
	I1201 19:34:13.069841   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.069849   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:13.069854   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:13.069913   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:13.110851   54581 cri.go:89] found id: ""
	I1201 19:34:13.110864   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.110871   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:13.110876   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:13.110933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:13.141612   54581 cri.go:89] found id: ""
	I1201 19:34:13.141626   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.141633   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:13.141638   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:13.141695   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:13.168579   54581 cri.go:89] found id: ""
	I1201 19:34:13.168592   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.168599   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:13.168604   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:13.168676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:13.194182   54581 cri.go:89] found id: ""
	I1201 19:34:13.194196   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.194204   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:13.194211   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:13.194221   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:13.255821   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:13.255840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:13.267071   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:13.267087   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:13.336403   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:13.336424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:13.336434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:13.399839   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:13.399859   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:15.930208   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:15.940605   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:15.940671   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:15.966201   54581 cri.go:89] found id: ""
	I1201 19:34:15.966215   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.966223   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:15.966228   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:15.966291   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:15.996515   54581 cri.go:89] found id: ""
	I1201 19:34:15.996528   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.996535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:15.996541   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:15.996598   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:16.022535   54581 cri.go:89] found id: ""
	I1201 19:34:16.022550   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.022564   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:16.022569   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:16.022630   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:16.057222   54581 cri.go:89] found id: ""
	I1201 19:34:16.057236   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.057246   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:16.057252   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:16.057313   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:16.087879   54581 cri.go:89] found id: ""
	I1201 19:34:16.087893   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.087900   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:16.087905   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:16.087965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:16.120946   54581 cri.go:89] found id: ""
	I1201 19:34:16.120960   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.120968   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:16.120974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:16.121035   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:16.154523   54581 cri.go:89] found id: ""
	I1201 19:34:16.154538   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.154544   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:16.154552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:16.154562   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:16.227282   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:16.227292   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:16.227303   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:16.291304   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:16.291323   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:16.320283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:16.320299   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:16.379997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:16.380014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:18.891691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:18.901502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:18.901561   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:18.926115   54581 cri.go:89] found id: ""
	I1201 19:34:18.926128   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.926135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:18.926141   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:18.926212   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:18.951977   54581 cri.go:89] found id: ""
	I1201 19:34:18.951991   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.951998   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:18.952003   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:18.952068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:18.983248   54581 cri.go:89] found id: ""
	I1201 19:34:18.983266   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.983273   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:18.983278   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:18.983342   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:19.010990   54581 cri.go:89] found id: ""
	I1201 19:34:19.011010   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.011018   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:19.011024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:19.011086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:19.036672   54581 cri.go:89] found id: ""
	I1201 19:34:19.036686   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.036693   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:19.036699   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:19.036767   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:19.061847   54581 cri.go:89] found id: ""
	I1201 19:34:19.061861   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.061868   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:19.061873   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:19.061933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:19.095496   54581 cri.go:89] found id: ""
	I1201 19:34:19.095518   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.095525   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:19.095534   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:19.095544   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:19.160188   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:19.160209   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:19.171389   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:19.171411   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:19.237242   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:19.237253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:19.237273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:19.299987   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:19.300005   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:21.834525   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:21.845009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:21.845070   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:21.869831   54581 cri.go:89] found id: ""
	I1201 19:34:21.869848   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.869855   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:21.869863   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:21.869920   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:21.894806   54581 cri.go:89] found id: ""
	I1201 19:34:21.894819   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.894826   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:21.894831   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:21.894888   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:21.919467   54581 cri.go:89] found id: ""
	I1201 19:34:21.919481   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.919489   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:21.919494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:21.919557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:21.947371   54581 cri.go:89] found id: ""
	I1201 19:34:21.947384   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.947392   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:21.947397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:21.947466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:21.972455   54581 cri.go:89] found id: ""
	I1201 19:34:21.972469   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.972488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:21.972493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:21.972551   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:21.998955   54581 cri.go:89] found id: ""
	I1201 19:34:21.998969   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.998977   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:21.998982   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:21.999044   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:22.030320   54581 cri.go:89] found id: ""
	I1201 19:34:22.030348   54581 logs.go:282] 0 containers: []
	W1201 19:34:22.030356   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:22.030365   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:22.030378   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:22.091531   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:22.091549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:22.107258   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:22.107285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:22.185420   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:22.185431   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:22.185442   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:22.250849   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:22.250866   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:24.779249   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:24.792463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:24.792522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:24.817350   54581 cri.go:89] found id: ""
	I1201 19:34:24.817364   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.817371   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:24.817377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:24.817434   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:24.842191   54581 cri.go:89] found id: ""
	I1201 19:34:24.842205   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.842218   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:24.842224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:24.842284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:24.867478   54581 cri.go:89] found id: ""
	I1201 19:34:24.867492   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.867499   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:24.867505   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:24.867576   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:24.899422   54581 cri.go:89] found id: ""
	I1201 19:34:24.899436   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.899443   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:24.899452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:24.899509   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:24.934866   54581 cri.go:89] found id: ""
	I1201 19:34:24.934880   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.934887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:24.934893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:24.934956   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:24.959270   54581 cri.go:89] found id: ""
	I1201 19:34:24.959284   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.959291   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:24.959297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:24.959362   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:24.984211   54581 cri.go:89] found id: ""
	I1201 19:34:24.984224   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.984231   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:24.984239   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:24.984259   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:25.012471   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:25.012487   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:25.072643   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:25.072660   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:25.083552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:25.083571   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:25.160495   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:25.160504   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:25.160516   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:27.727176   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:27.737246   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:27.737307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:27.761343   54581 cri.go:89] found id: ""
	I1201 19:34:27.761357   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.761364   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:27.761370   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:27.761428   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:27.786257   54581 cri.go:89] found id: ""
	I1201 19:34:27.786276   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.786283   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:27.786288   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:27.786344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:27.810779   54581 cri.go:89] found id: ""
	I1201 19:34:27.810798   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.810807   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:27.810812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:27.810874   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:27.834773   54581 cri.go:89] found id: ""
	I1201 19:34:27.834792   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.834799   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:27.834804   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:27.834860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:27.862223   54581 cri.go:89] found id: ""
	I1201 19:34:27.862241   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.862248   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:27.862253   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:27.862307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:27.887279   54581 cri.go:89] found id: ""
	I1201 19:34:27.887292   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.887299   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:27.887305   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:27.887361   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:27.910821   54581 cri.go:89] found id: ""
	I1201 19:34:27.910834   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.910842   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:27.910849   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:27.910872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:27.920894   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:27.920909   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:27.982787   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:27.982797   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:27.982808   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:28.049448   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:28.049466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:28.083298   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:28.083315   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:30.648755   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:30.659054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:30.659115   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:30.683776   54581 cri.go:89] found id: ""
	I1201 19:34:30.683790   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.683797   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:30.683802   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:30.683858   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:30.708715   54581 cri.go:89] found id: ""
	I1201 19:34:30.708729   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.708736   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:30.708741   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:30.708801   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:30.732741   54581 cri.go:89] found id: ""
	I1201 19:34:30.732754   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.732761   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:30.732767   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:30.732821   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:30.762264   54581 cri.go:89] found id: ""
	I1201 19:34:30.762278   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.762284   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:30.762290   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:30.762353   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:30.789298   54581 cri.go:89] found id: ""
	I1201 19:34:30.789312   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.789319   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:30.789324   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:30.789381   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:30.814068   54581 cri.go:89] found id: ""
	I1201 19:34:30.814081   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.814089   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:30.814095   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:30.814157   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:30.841381   54581 cri.go:89] found id: ""
	I1201 19:34:30.841394   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.841402   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:30.841409   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:30.841431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:30.902920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:30.902931   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:30.902943   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:30.965009   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:30.965026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:30.993347   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:30.993370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:31.049258   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:31.049275   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.560996   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:33.571497   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:33.571557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:33.596875   54581 cri.go:89] found id: ""
	I1201 19:34:33.596889   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.596896   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:33.596901   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:33.596960   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:33.623640   54581 cri.go:89] found id: ""
	I1201 19:34:33.623653   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.623659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:33.623664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:33.623725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:33.647792   54581 cri.go:89] found id: ""
	I1201 19:34:33.647806   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.647814   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:33.647819   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:33.647882   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:33.672114   54581 cri.go:89] found id: ""
	I1201 19:34:33.672127   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.672134   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:33.672139   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:33.672197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:33.704799   54581 cri.go:89] found id: ""
	I1201 19:34:33.704812   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.704820   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:33.704825   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:33.704885   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:33.728981   54581 cri.go:89] found id: ""
	I1201 19:34:33.728995   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.729001   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:33.729006   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:33.729063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:33.756005   54581 cri.go:89] found id: ""
	I1201 19:34:33.756019   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.756027   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:33.756035   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:33.756046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:33.788420   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:33.788437   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:33.848036   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:33.848054   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.858909   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:33.858925   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:33.921156   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:33.921167   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:33.921178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.484434   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:36.494616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:36.494679   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:36.520017   54581 cri.go:89] found id: ""
	I1201 19:34:36.520031   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.520038   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:36.520044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:36.520100   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:36.545876   54581 cri.go:89] found id: ""
	I1201 19:34:36.545890   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.545897   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:36.545903   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:36.545966   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:36.571571   54581 cri.go:89] found id: ""
	I1201 19:34:36.571584   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.571591   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:36.571596   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:36.571653   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:36.596997   54581 cri.go:89] found id: ""
	I1201 19:34:36.597012   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.597019   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:36.597024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:36.597101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:36.623469   54581 cri.go:89] found id: ""
	I1201 19:34:36.623483   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.623491   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:36.623496   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:36.623556   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:36.651811   54581 cri.go:89] found id: ""
	I1201 19:34:36.651824   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.651831   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:36.651837   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:36.651893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:36.676659   54581 cri.go:89] found id: ""
	I1201 19:34:36.676673   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.676680   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:36.676688   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:36.676697   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:36.732392   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:36.732410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:36.743384   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:36.743400   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:36.805329   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:36.805338   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:36.805349   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.867566   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:36.867584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:39.402157   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:39.412161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:39.412220   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:39.439366   54581 cri.go:89] found id: ""
	I1201 19:34:39.439380   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.439387   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:39.439392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:39.439451   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:39.464076   54581 cri.go:89] found id: ""
	I1201 19:34:39.464090   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.464097   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:39.464108   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:39.464171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:39.488248   54581 cri.go:89] found id: ""
	I1201 19:34:39.488262   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.488270   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:39.488275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:39.488331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:39.517302   54581 cri.go:89] found id: ""
	I1201 19:34:39.517315   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.517322   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:39.517328   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:39.517385   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:39.542966   54581 cri.go:89] found id: ""
	I1201 19:34:39.542980   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.542986   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:39.542992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:39.543051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:39.568903   54581 cri.go:89] found id: ""
	I1201 19:34:39.568917   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.568924   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:39.568929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:39.568990   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:39.594057   54581 cri.go:89] found id: ""
	I1201 19:34:39.594069   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.594076   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:39.594084   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:39.594093   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:39.649679   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:39.649698   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:39.660114   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:39.660133   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:39.725472   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:39.725500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:39.725512   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:39.793738   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:39.793756   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:42.322742   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:42.333451   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:42.333536   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:42.368118   54581 cri.go:89] found id: ""
	I1201 19:34:42.368132   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.368139   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:42.368146   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:42.368217   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:42.402172   54581 cri.go:89] found id: ""
	I1201 19:34:42.402186   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.402193   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:42.402198   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:42.402266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:42.426759   54581 cri.go:89] found id: ""
	I1201 19:34:42.426772   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.426780   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:42.426785   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:42.426842   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:42.466084   54581 cri.go:89] found id: ""
	I1201 19:34:42.466097   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.466105   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:42.466110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:42.466168   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:42.490814   54581 cri.go:89] found id: ""
	I1201 19:34:42.490828   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.490835   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:42.490841   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:42.490899   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:42.516557   54581 cri.go:89] found id: ""
	I1201 19:34:42.516570   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.516578   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:42.516583   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:42.516651   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:42.542203   54581 cri.go:89] found id: ""
	I1201 19:34:42.542218   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.542224   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:42.542233   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:42.542243   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:42.599254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:42.599272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:42.610313   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:42.610328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:42.677502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:42.677514   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:42.677527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:42.751656   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:42.751683   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.281764   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:45.295929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:45.296004   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:45.355982   54581 cri.go:89] found id: ""
	I1201 19:34:45.356019   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.356027   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:45.356043   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:45.356214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:45.395974   54581 cri.go:89] found id: ""
	I1201 19:34:45.395987   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.396003   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:45.396008   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:45.396064   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:45.425011   54581 cri.go:89] found id: ""
	I1201 19:34:45.425027   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.425035   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:45.425041   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:45.425175   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:45.450304   54581 cri.go:89] found id: ""
	I1201 19:34:45.450317   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.450325   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:45.450330   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:45.450399   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:45.480282   54581 cri.go:89] found id: ""
	I1201 19:34:45.480296   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.480302   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:45.480307   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:45.480376   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:45.511012   54581 cri.go:89] found id: ""
	I1201 19:34:45.511026   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.511033   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:45.511039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:45.511101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:45.536767   54581 cri.go:89] found id: ""
	I1201 19:34:45.536781   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.536797   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:45.536806   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:45.536818   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:45.547801   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:45.547822   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:45.615408   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:45.615424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:45.615434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:45.679022   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:45.679041   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.711030   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:45.711049   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.268349   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:48.279339   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:48.279398   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:48.304817   54581 cri.go:89] found id: ""
	I1201 19:34:48.304831   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.304839   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:48.304844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:48.304905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:48.329897   54581 cri.go:89] found id: ""
	I1201 19:34:48.329911   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.329919   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:48.329924   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:48.329982   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:48.369087   54581 cri.go:89] found id: ""
	I1201 19:34:48.369100   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.369107   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:48.369112   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:48.369169   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:48.400882   54581 cri.go:89] found id: ""
	I1201 19:34:48.400896   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.400903   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:48.400909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:48.400965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:48.426896   54581 cri.go:89] found id: ""
	I1201 19:34:48.426912   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.426920   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:48.426925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:48.426987   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:48.455956   54581 cri.go:89] found id: ""
	I1201 19:34:48.455969   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.455987   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:48.455994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:48.456051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:48.480640   54581 cri.go:89] found id: ""
	I1201 19:34:48.480653   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.480671   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:48.480679   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:48.480690   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.536591   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:48.536609   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:48.547466   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:48.547482   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:48.620325   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:48.620335   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:48.620345   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:48.683388   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:48.683407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:51.214144   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:51.224292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:51.224364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:51.247923   54581 cri.go:89] found id: ""
	I1201 19:34:51.247937   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.247945   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:51.247952   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:51.248011   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:51.273984   54581 cri.go:89] found id: ""
	I1201 19:34:51.273998   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.274005   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:51.274011   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:51.274072   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:51.298775   54581 cri.go:89] found id: ""
	I1201 19:34:51.298789   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.298796   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:51.298801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:51.298860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:51.326553   54581 cri.go:89] found id: ""
	I1201 19:34:51.326567   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.326574   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:51.326580   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:51.326639   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:51.360945   54581 cri.go:89] found id: ""
	I1201 19:34:51.360959   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.360987   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:51.360992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:51.361059   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:51.396255   54581 cri.go:89] found id: ""
	I1201 19:34:51.396282   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.396290   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:51.396296   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:51.396369   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:51.427687   54581 cri.go:89] found id: ""
	I1201 19:34:51.427700   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.427707   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:51.427715   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:51.427734   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:51.483915   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:51.483934   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:51.495247   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:51.495271   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:51.559547   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:51.559558   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:51.559568   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:51.623141   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:51.623161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:54.157001   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:54.170439   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:54.170498   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:54.203772   54581 cri.go:89] found id: ""
	I1201 19:34:54.203785   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.203792   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:54.203798   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:54.203854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:54.231733   54581 cri.go:89] found id: ""
	I1201 19:34:54.231747   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.231754   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:54.231759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:54.231817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:54.256716   54581 cri.go:89] found id: ""
	I1201 19:34:54.256739   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.256746   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:54.256752   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:54.256817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:54.281376   54581 cri.go:89] found id: ""
	I1201 19:34:54.281390   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.281407   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:54.281413   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:54.281469   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:54.305969   54581 cri.go:89] found id: ""
	I1201 19:34:54.305982   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.305989   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:54.305994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:54.306049   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:54.330385   54581 cri.go:89] found id: ""
	I1201 19:34:54.330399   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.330406   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:54.330422   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:54.330478   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:54.358455   54581 cri.go:89] found id: ""
	I1201 19:34:54.358478   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.358489   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:54.358497   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:54.358508   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:54.422783   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:54.422804   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:54.434139   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:54.434153   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:54.499665   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:54.499677   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:54.499689   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:54.562594   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:54.562614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:57.093944   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:57.104140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:57.104207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:57.129578   54581 cri.go:89] found id: ""
	I1201 19:34:57.129590   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.129597   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:57.129603   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:57.129663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:57.153119   54581 cri.go:89] found id: ""
	I1201 19:34:57.153133   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.153140   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:57.153145   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:57.153202   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:57.178134   54581 cri.go:89] found id: ""
	I1201 19:34:57.178148   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.178155   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:57.178161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:57.178222   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:57.208559   54581 cri.go:89] found id: ""
	I1201 19:34:57.208572   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.208579   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:57.208585   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:57.208642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:57.232807   54581 cri.go:89] found id: ""
	I1201 19:34:57.232821   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.232838   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:57.232844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:57.232898   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:57.257939   54581 cri.go:89] found id: ""
	I1201 19:34:57.257952   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.257959   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:57.257964   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:57.258022   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:57.283855   54581 cri.go:89] found id: ""
	I1201 19:34:57.283869   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.283875   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:57.283883   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:57.283893   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:57.340764   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:57.340781   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:57.352935   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:57.352949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:57.427562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:57.427571   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:57.427581   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:57.490526   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:57.490553   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.020694   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:00.036199   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:00.036266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:00.146207   54581 cri.go:89] found id: ""
	I1201 19:35:00.146226   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.146234   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:00.146241   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:00.146319   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:00.271439   54581 cri.go:89] found id: ""
	I1201 19:35:00.271454   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.271462   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:00.271468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:00.271541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:00.365096   54581 cri.go:89] found id: ""
	I1201 19:35:00.365111   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.365119   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:00.365124   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:00.365190   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:00.419095   54581 cri.go:89] found id: ""
	I1201 19:35:00.419109   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.419116   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:00.419123   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:00.419184   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:00.457455   54581 cri.go:89] found id: ""
	I1201 19:35:00.457470   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.457478   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:00.457507   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:00.457577   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:00.503679   54581 cri.go:89] found id: ""
	I1201 19:35:00.503694   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.503701   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:00.503710   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:00.503803   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:00.546120   54581 cri.go:89] found id: ""
	I1201 19:35:00.546135   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.546142   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:00.546151   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:00.546164   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:00.559836   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:00.559853   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:00.634650   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:00.634660   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:00.634675   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:00.700259   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:00.700278   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.733345   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:00.733363   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.295407   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:03.305664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:03.305725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:03.330370   54581 cri.go:89] found id: ""
	I1201 19:35:03.330385   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.330392   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:03.330397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:03.330452   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:03.356109   54581 cri.go:89] found id: ""
	I1201 19:35:03.356123   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.356130   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:03.356135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:03.356198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:03.382338   54581 cri.go:89] found id: ""
	I1201 19:35:03.382352   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.382360   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:03.382366   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:03.382423   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:03.414550   54581 cri.go:89] found id: ""
	I1201 19:35:03.414564   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.414571   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:03.414577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:03.414633   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:03.438540   54581 cri.go:89] found id: ""
	I1201 19:35:03.438553   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.438560   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:03.438565   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:03.438623   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:03.463113   54581 cri.go:89] found id: ""
	I1201 19:35:03.463127   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.463134   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:03.463140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:03.463204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:03.487632   54581 cri.go:89] found id: ""
	I1201 19:35:03.487645   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.487653   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:03.487660   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:03.487670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.544515   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:03.544536   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:03.555787   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:03.555803   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:03.627256   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:03.627266   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:03.627276   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:03.691235   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:03.691254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.220125   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:06.230749   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:06.230813   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:06.255951   54581 cri.go:89] found id: ""
	I1201 19:35:06.255965   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.255972   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:06.255977   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:06.256034   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:06.281528   54581 cri.go:89] found id: ""
	I1201 19:35:06.281542   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.281549   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:06.281554   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:06.281613   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:06.306502   54581 cri.go:89] found id: ""
	I1201 19:35:06.306515   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.306522   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:06.306527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:06.306590   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:06.337726   54581 cri.go:89] found id: ""
	I1201 19:35:06.337739   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.337745   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:06.337751   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:06.337810   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:06.367682   54581 cri.go:89] found id: ""
	I1201 19:35:06.367696   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.367713   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:06.367726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:06.367793   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:06.397675   54581 cri.go:89] found id: ""
	I1201 19:35:06.397690   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.397707   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:06.397713   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:06.397778   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:06.424426   54581 cri.go:89] found id: ""
	I1201 19:35:06.424439   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.424452   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:06.424460   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:06.424471   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:06.435325   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:06.435340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:06.499920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:06.492188   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.492789   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494445   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494930   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.496500   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:06.499942   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:06.499952   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:06.564348   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:06.564367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.592906   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:06.592921   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:09.151061   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:09.161179   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:09.161240   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:09.186739   54581 cri.go:89] found id: ""
	I1201 19:35:09.186752   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.186759   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:09.186765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:09.186822   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:09.211245   54581 cri.go:89] found id: ""
	I1201 19:35:09.211259   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.211267   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:09.211273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:09.211336   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:09.239043   54581 cri.go:89] found id: ""
	I1201 19:35:09.239056   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.239063   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:09.239068   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:09.239125   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:09.264055   54581 cri.go:89] found id: ""
	I1201 19:35:09.264068   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.264076   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:09.264081   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:09.264137   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:09.288509   54581 cri.go:89] found id: ""
	I1201 19:35:09.288522   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.288529   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:09.288536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:09.288593   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:09.312763   54581 cri.go:89] found id: ""
	I1201 19:35:09.312777   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.312784   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:09.312789   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:09.312851   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:09.344164   54581 cri.go:89] found id: ""
	I1201 19:35:09.344177   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.344184   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:09.344192   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:09.344203   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:09.356120   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:09.356134   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:09.428320   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:09.420284   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.420923   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.422622   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.423259   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.424866   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:09.428329   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:09.428339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:09.491282   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:09.491301   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:09.518473   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:09.518488   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.081815   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:12.092336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:12.092400   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:12.117269   54581 cri.go:89] found id: ""
	I1201 19:35:12.117284   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.117291   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:12.117297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:12.117355   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:12.141885   54581 cri.go:89] found id: ""
	I1201 19:35:12.141898   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.141904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:12.141909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:12.141968   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:12.166386   54581 cri.go:89] found id: ""
	I1201 19:35:12.166400   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.166407   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:12.166411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:12.166479   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:12.190615   54581 cri.go:89] found id: ""
	I1201 19:35:12.190628   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.190636   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:12.190641   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:12.190701   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:12.219887   54581 cri.go:89] found id: ""
	I1201 19:35:12.219900   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.219907   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:12.219912   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:12.219970   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:12.244718   54581 cri.go:89] found id: ""
	I1201 19:35:12.244731   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.244738   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:12.244743   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:12.244802   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:12.272273   54581 cri.go:89] found id: ""
	I1201 19:35:12.272287   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.272294   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:12.272301   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:12.272312   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.329315   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:12.329334   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:12.343015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:12.343032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:12.419939   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:12.411319   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.412222   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414132   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414468   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.415971   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:12.419949   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:12.419960   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:12.482187   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:12.482205   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:15.011802   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:15.022432   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:15.022499   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:15.094886   54581 cri.go:89] found id: ""
	I1201 19:35:15.094901   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.094909   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:15.094915   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:15.094978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:15.120839   54581 cri.go:89] found id: ""
	I1201 19:35:15.120853   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.120860   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:15.120865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:15.120927   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:15.150767   54581 cri.go:89] found id: ""
	I1201 19:35:15.150781   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.150795   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:15.150801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:15.150867   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:15.177630   54581 cri.go:89] found id: ""
	I1201 19:35:15.177644   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.177651   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:15.177656   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:15.177727   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:15.203467   54581 cri.go:89] found id: ""
	I1201 19:35:15.203480   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.203498   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:15.203504   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:15.203563   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:15.229010   54581 cri.go:89] found id: ""
	I1201 19:35:15.229023   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.229031   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:15.229036   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:15.229128   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:15.254029   54581 cri.go:89] found id: ""
	I1201 19:35:15.254043   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.254051   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:15.254058   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:15.254068   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:15.309931   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:15.309949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:15.320452   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:15.320466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:15.413158   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:15.405233   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.405928   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.407533   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.408047   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.409794   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:15.413169   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:15.413180   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:15.475409   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:15.475428   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:18.004450   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:18.015126   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:18.015185   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:18.046344   54581 cri.go:89] found id: ""
	I1201 19:35:18.046359   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.046366   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:18.046373   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:18.046436   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:18.074519   54581 cri.go:89] found id: ""
	I1201 19:35:18.074532   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.074539   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:18.074545   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:18.074603   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:18.103787   54581 cri.go:89] found id: ""
	I1201 19:35:18.103801   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.103808   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:18.103814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:18.103869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:18.130363   54581 cri.go:89] found id: ""
	I1201 19:35:18.130377   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.130384   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:18.130390   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:18.130449   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:18.155589   54581 cri.go:89] found id: ""
	I1201 19:35:18.155616   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.155625   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:18.155630   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:18.155699   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:18.180628   54581 cri.go:89] found id: ""
	I1201 19:35:18.180641   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.180648   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:18.180654   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:18.180711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:18.205996   54581 cri.go:89] found id: ""
	I1201 19:35:18.206026   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.206033   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:18.206041   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:18.206051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:18.260718   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:18.260736   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:18.271842   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:18.271858   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:18.342769   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:18.332100   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.332989   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.334523   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.336007   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.337221   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:18.342780   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:18.342793   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:18.423726   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:18.423744   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:20.954199   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:20.964087   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:20.964143   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:20.987490   54581 cri.go:89] found id: ""
	I1201 19:35:20.987504   54581 logs.go:282] 0 containers: []
	W1201 19:35:20.987510   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:20.987516   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:20.987572   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:21.012114   54581 cri.go:89] found id: ""
	I1201 19:35:21.012128   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.012135   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:21.012140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:21.012201   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:21.037730   54581 cri.go:89] found id: ""
	I1201 19:35:21.037744   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.037751   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:21.037756   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:21.037815   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:21.062445   54581 cri.go:89] found id: ""
	I1201 19:35:21.062458   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.062465   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:21.062471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:21.062529   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:21.086847   54581 cri.go:89] found id: ""
	I1201 19:35:21.086860   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.086867   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:21.086872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:21.086930   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:21.111866   54581 cri.go:89] found id: ""
	I1201 19:35:21.111880   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.111886   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:21.111892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:21.111948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:21.136296   54581 cri.go:89] found id: ""
	I1201 19:35:21.136311   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.136318   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:21.136326   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:21.136343   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:21.200999   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:21.201009   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:21.201020   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:21.265838   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:21.265857   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:21.296214   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:21.296230   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:21.354254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:21.354272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:23.868647   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:23.879143   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:23.879205   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:23.907613   54581 cri.go:89] found id: ""
	I1201 19:35:23.907633   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.907640   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:23.907645   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:23.907705   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:23.932767   54581 cri.go:89] found id: ""
	I1201 19:35:23.932781   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.932787   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:23.932793   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:23.932849   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:23.961305   54581 cri.go:89] found id: ""
	I1201 19:35:23.961319   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.961326   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:23.961331   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:23.961387   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:23.986651   54581 cri.go:89] found id: ""
	I1201 19:35:23.986664   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.986670   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:23.986676   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:23.986734   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:24.011204   54581 cri.go:89] found id: ""
	I1201 19:35:24.011218   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.011225   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:24.011230   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:24.011286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:24.040784   54581 cri.go:89] found id: ""
	I1201 19:35:24.040798   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.040806   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:24.040812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:24.040871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:24.067432   54581 cri.go:89] found id: ""
	I1201 19:35:24.067446   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.067453   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:24.067461   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:24.067472   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:24.132929   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:24.132946   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:24.132956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:24.194894   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:24.194912   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:24.225351   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:24.225366   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:24.282142   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:24.282161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:26.793143   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:26.803454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:26.803518   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:26.827433   54581 cri.go:89] found id: ""
	I1201 19:35:26.827447   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.827454   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:26.827459   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:26.827514   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:26.851666   54581 cri.go:89] found id: ""
	I1201 19:35:26.851680   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.851686   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:26.851691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:26.851749   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:26.880353   54581 cri.go:89] found id: ""
	I1201 19:35:26.880367   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.880374   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:26.880379   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:26.880437   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:26.908944   54581 cri.go:89] found id: ""
	I1201 19:35:26.908957   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.908964   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:26.908969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:26.909025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:26.933983   54581 cri.go:89] found id: ""
	I1201 19:35:26.933996   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.934003   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:26.934009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:26.934069   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:26.958791   54581 cri.go:89] found id: ""
	I1201 19:35:26.958805   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.958812   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:26.958818   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:26.958878   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:26.983156   54581 cri.go:89] found id: ""
	I1201 19:35:26.983170   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.983177   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:26.983185   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:26.983200   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:27.038997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:27.039015   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:27.050299   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:27.050314   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:27.113733   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:27.113744   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:27.113754   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:27.176267   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:27.176285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.706128   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:29.716285   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:29.716344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:29.741420   54581 cri.go:89] found id: ""
	I1201 19:35:29.741435   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.741442   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:29.741447   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:29.741545   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:29.766524   54581 cri.go:89] found id: ""
	I1201 19:35:29.766538   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.766545   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:29.766550   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:29.766616   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:29.795421   54581 cri.go:89] found id: ""
	I1201 19:35:29.795434   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.795441   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:29.795446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:29.795511   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:29.821121   54581 cri.go:89] found id: ""
	I1201 19:35:29.821135   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.821142   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:29.821147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:29.821204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:29.849641   54581 cri.go:89] found id: ""
	I1201 19:35:29.849654   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.849662   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:29.849667   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:29.849724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:29.874049   54581 cri.go:89] found id: ""
	I1201 19:35:29.874063   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.874069   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:29.874075   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:29.874136   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:29.897867   54581 cri.go:89] found id: ""
	I1201 19:35:29.897880   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.897887   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:29.897895   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:29.897905   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:29.959029   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:29.959046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.991283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:29.991298   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:30.051265   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:30.051286   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:30.082322   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:30.082339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:30.173300   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:32.673672   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:32.683965   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:32.684023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:32.712191   54581 cri.go:89] found id: ""
	I1201 19:35:32.712204   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.712211   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:32.712216   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:32.712275   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:32.739246   54581 cri.go:89] found id: ""
	I1201 19:35:32.739259   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.739266   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:32.739272   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:32.739331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:32.763898   54581 cri.go:89] found id: ""
	I1201 19:35:32.763911   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.763924   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:32.763929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:32.763989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:32.789967   54581 cri.go:89] found id: ""
	I1201 19:35:32.789990   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.789997   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:32.790004   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:32.790063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:32.816013   54581 cri.go:89] found id: ""
	I1201 19:35:32.816028   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.816035   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:32.816040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:32.816098   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:32.839560   54581 cri.go:89] found id: ""
	I1201 19:35:32.839573   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.839580   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:32.839586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:32.839644   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:32.868062   54581 cri.go:89] found id: ""
	I1201 19:35:32.868075   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.868082   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:32.868090   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:32.868099   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:32.923266   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:32.923285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:32.934015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:32.934030   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:33.005502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:33.005512   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:33.005523   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:33.075965   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:33.075984   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.605628   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:35.617054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:35.617126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:35.646998   54581 cri.go:89] found id: ""
	I1201 19:35:35.647012   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.647019   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:35.647025   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:35.647086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:35.676130   54581 cri.go:89] found id: ""
	I1201 19:35:35.676143   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.676150   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:35.676155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:35.676211   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:35.700589   54581 cri.go:89] found id: ""
	I1201 19:35:35.700602   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.700609   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:35.700616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:35.700672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:35.725233   54581 cri.go:89] found id: ""
	I1201 19:35:35.725246   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.725253   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:35.725273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:35.725343   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:35.750382   54581 cri.go:89] found id: ""
	I1201 19:35:35.750396   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.750403   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:35.750408   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:35.750462   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:35.775219   54581 cri.go:89] found id: ""
	I1201 19:35:35.775235   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.775243   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:35.775248   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:35.775320   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:35.800831   54581 cri.go:89] found id: ""
	I1201 19:35:35.800845   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.800852   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:35.800859   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:35.800870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:35.866740   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:35.858616   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.859347   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861068   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861726   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.863343   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:35.866756   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:35.866767   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:35.931013   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:35.931031   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.958721   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:35.958743   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:36.015847   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:36.015863   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.535518   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:38.545931   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:38.545993   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:38.571083   54581 cri.go:89] found id: ""
	I1201 19:35:38.571097   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.571104   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:38.571109   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:38.571170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:38.608738   54581 cri.go:89] found id: ""
	I1201 19:35:38.608752   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.608759   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:38.608765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:38.608820   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:38.635605   54581 cri.go:89] found id: ""
	I1201 19:35:38.635619   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.635626   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:38.635631   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:38.635689   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:38.668134   54581 cri.go:89] found id: ""
	I1201 19:35:38.668147   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.668155   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:38.668172   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:38.668231   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:38.693505   54581 cri.go:89] found id: ""
	I1201 19:35:38.693519   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.693526   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:38.693531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:38.693602   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:38.719017   54581 cri.go:89] found id: ""
	I1201 19:35:38.719031   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.719039   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:38.719044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:38.719103   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:38.748727   54581 cri.go:89] found id: ""
	I1201 19:35:38.748740   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.748747   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:38.748754   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:38.748765   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:38.778021   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:38.778037   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:38.838504   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:38.838524   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.851587   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:38.851603   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:38.919080   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:38.909975   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.911254   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.912341   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.913320   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.914352   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:38.919115   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:38.919130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.484602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:41.495239   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:41.495298   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:41.525151   54581 cri.go:89] found id: ""
	I1201 19:35:41.525165   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.525172   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:41.525191   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:41.525256   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:41.551287   54581 cri.go:89] found id: ""
	I1201 19:35:41.551301   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.551309   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:41.551329   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:41.551392   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:41.577108   54581 cri.go:89] found id: ""
	I1201 19:35:41.577124   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.577131   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:41.577136   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:41.577204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:41.613970   54581 cri.go:89] found id: ""
	I1201 19:35:41.613983   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.613991   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:41.614005   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:41.614063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:41.647948   54581 cri.go:89] found id: ""
	I1201 19:35:41.647961   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.647968   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:41.647973   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:41.648038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:41.675741   54581 cri.go:89] found id: ""
	I1201 19:35:41.675754   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.675761   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:41.675770   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:41.675827   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:41.701031   54581 cri.go:89] found id: ""
	I1201 19:35:41.701053   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.701061   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:41.701068   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:41.701079   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:41.712066   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:41.712081   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:41.774820   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:41.767074   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.767651   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769208   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769794   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.771321   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:41.774852   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:41.774864   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.837237   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:41.837254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:41.867407   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:41.867423   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.425417   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:44.436694   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:44.436764   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:44.462550   54581 cri.go:89] found id: ""
	I1201 19:35:44.462565   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.462571   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:44.462577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:44.462634   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:44.490237   54581 cri.go:89] found id: ""
	I1201 19:35:44.490250   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.490257   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:44.490262   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:44.490318   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:44.517417   54581 cri.go:89] found id: ""
	I1201 19:35:44.517431   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.517438   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:44.517443   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:44.517523   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:44.542502   54581 cri.go:89] found id: ""
	I1201 19:35:44.542516   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.542523   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:44.542528   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:44.542588   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:44.568636   54581 cri.go:89] found id: ""
	I1201 19:35:44.568650   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.568682   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:44.568688   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:44.568756   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:44.602872   54581 cri.go:89] found id: ""
	I1201 19:35:44.602891   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.602898   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:44.602904   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:44.602961   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:44.633265   54581 cri.go:89] found id: ""
	I1201 19:35:44.633280   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.633287   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:44.633295   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:44.633305   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:44.704029   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:44.695965   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.696791   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698434   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698915   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.700082   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:35:44.704040   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:44.704051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:44.768055   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:44.768075   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:44.797083   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:44.797098   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.852537   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:44.852555   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.364630   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:47.374921   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:47.374978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:47.399587   54581 cri.go:89] found id: ""
	I1201 19:35:47.399600   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.399607   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:47.399613   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:47.399672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:47.426120   54581 cri.go:89] found id: ""
	I1201 19:35:47.426134   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.426141   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:47.426147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:47.426227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:47.457662   54581 cri.go:89] found id: ""
	I1201 19:35:47.457676   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.457683   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:47.457689   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:47.457747   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:47.482682   54581 cri.go:89] found id: ""
	I1201 19:35:47.482702   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.482709   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:47.482728   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:47.482796   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:47.511319   54581 cri.go:89] found id: ""
	I1201 19:35:47.511334   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.511341   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:47.511346   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:47.511409   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:47.543730   54581 cri.go:89] found id: ""
	I1201 19:35:47.543742   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.543760   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:47.543765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:47.543831   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:47.572333   54581 cri.go:89] found id: ""
	I1201 19:35:47.572347   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.572355   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:47.572363   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:47.572385   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:47.637165   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:47.637184   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.648940   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:47.648956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:47.711651   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:47.704333   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.704738   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706241   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706574   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.708054   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:47.704333   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.704738   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706241   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706574   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.708054   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:47.711662   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:47.711681   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:47.773144   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:47.773163   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:50.303086   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:50.313234   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:50.313293   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:50.337483   54581 cri.go:89] found id: ""
	I1201 19:35:50.337515   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.337522   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:50.337527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:50.337583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:50.363911   54581 cri.go:89] found id: ""
	I1201 19:35:50.363927   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.363934   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:50.363939   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:50.363994   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:50.388359   54581 cri.go:89] found id: ""
	I1201 19:35:50.388373   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.388380   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:50.388386   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:50.388441   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:50.412983   54581 cri.go:89] found id: ""
	I1201 19:35:50.412996   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.413003   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:50.413014   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:50.413073   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:50.440996   54581 cri.go:89] found id: ""
	I1201 19:35:50.441017   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.441024   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:50.441030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:50.441085   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:50.467480   54581 cri.go:89] found id: ""
	I1201 19:35:50.467493   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.467501   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:50.467506   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:50.467567   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:50.494388   54581 cri.go:89] found id: ""
	I1201 19:35:50.494402   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.494409   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:50.494416   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:50.494427   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:50.550339   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:50.550359   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:50.561242   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:50.561258   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:50.633849   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:50.625518   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.626220   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.627078   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628200   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628973   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:50.625518   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.626220   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.627078   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628200   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628973   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:50.633860   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:50.633870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:50.702260   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:50.702280   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:53.234959   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:53.245018   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:53.245083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:53.276399   54581 cri.go:89] found id: ""
	I1201 19:35:53.276413   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.276420   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:53.276425   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:53.276491   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:53.305853   54581 cri.go:89] found id: ""
	I1201 19:35:53.305866   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.305873   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:53.305878   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:53.305935   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:53.335241   54581 cri.go:89] found id: ""
	I1201 19:35:53.335255   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.335263   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:53.335269   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:53.335328   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:53.359467   54581 cri.go:89] found id: ""
	I1201 19:35:53.359481   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.359488   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:53.359493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:53.359550   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:53.384120   54581 cri.go:89] found id: ""
	I1201 19:35:53.384134   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.384141   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:53.384147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:53.384203   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:53.414128   54581 cri.go:89] found id: ""
	I1201 19:35:53.414141   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.414149   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:53.414155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:53.414214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:53.439408   54581 cri.go:89] found id: ""
	I1201 19:35:53.439421   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.439428   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:53.439436   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:53.439446   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:53.495007   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:53.495026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:53.505932   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:53.505948   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:53.572678   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:53.572688   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:53.572702   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:53.650600   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:53.650621   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:56.183319   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:56.193782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:56.193843   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:56.224114   54581 cri.go:89] found id: ""
	I1201 19:35:56.224128   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.224135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:56.224140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:56.224197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:56.254013   54581 cri.go:89] found id: ""
	I1201 19:35:56.254027   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.254034   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:56.254040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:56.254102   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:56.279886   54581 cri.go:89] found id: ""
	I1201 19:35:56.279900   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.279908   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:56.279914   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:56.279976   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:56.304943   54581 cri.go:89] found id: ""
	I1201 19:35:56.304956   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.304963   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:56.304969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:56.305025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:56.328633   54581 cri.go:89] found id: ""
	I1201 19:35:56.328647   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.328654   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:56.328659   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:56.328715   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:56.357255   54581 cri.go:89] found id: ""
	I1201 19:35:56.357269   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.357276   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:56.357281   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:56.357340   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:56.381420   54581 cri.go:89] found id: ""
	I1201 19:35:56.381434   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.381441   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:56.381449   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:56.381459   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:56.439709   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:56.439728   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:56.450590   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:56.450605   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:56.516412   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:56.516423   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:56.516435   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:56.577800   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:56.577828   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.114477   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:59.124117   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:59.124179   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:59.151351   54581 cri.go:89] found id: ""
	I1201 19:35:59.151364   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.151372   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:59.151377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:59.151433   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:59.179997   54581 cri.go:89] found id: ""
	I1201 19:35:59.180010   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.180017   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:59.180022   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:59.180084   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:59.204818   54581 cri.go:89] found id: ""
	I1201 19:35:59.204832   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.204859   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:59.204864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:59.204923   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:59.230443   54581 cri.go:89] found id: ""
	I1201 19:35:59.230456   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.230464   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:59.230470   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:59.230524   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:59.254548   54581 cri.go:89] found id: ""
	I1201 19:35:59.254561   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.254569   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:59.254574   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:59.254629   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:59.282564   54581 cri.go:89] found id: ""
	I1201 19:35:59.282577   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.282584   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:59.282590   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:59.282645   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:59.310544   54581 cri.go:89] found id: ""
	I1201 19:35:59.310557   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.310565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:59.310573   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:59.310587   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:59.377012   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:59.377021   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:59.377032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:59.441479   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:59.441511   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.471908   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:59.471924   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:59.527613   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:59.527631   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.040294   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:02.051787   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:02.051869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:02.077788   54581 cri.go:89] found id: ""
	I1201 19:36:02.077801   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.077808   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:02.077814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:02.077871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:02.103346   54581 cri.go:89] found id: ""
	I1201 19:36:02.103359   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.103366   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:02.103371   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:02.103427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:02.128949   54581 cri.go:89] found id: ""
	I1201 19:36:02.128963   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.128970   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:02.128975   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:02.129033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:02.153585   54581 cri.go:89] found id: ""
	I1201 19:36:02.153598   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.153605   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:02.153611   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:02.153668   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:02.180499   54581 cri.go:89] found id: ""
	I1201 19:36:02.180513   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.180520   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:02.180531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:02.180592   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:02.206116   54581 cri.go:89] found id: ""
	I1201 19:36:02.206131   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.206138   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:02.206144   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:02.206210   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:02.232470   54581 cri.go:89] found id: ""
	I1201 19:36:02.232484   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.232492   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:02.232500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:02.232513   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:02.295347   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:02.295367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:02.323002   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:02.323018   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:02.382028   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:02.382046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.393159   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:02.393176   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:02.457522   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:04.957729   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:04.967951   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:04.968012   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:04.993754   54581 cri.go:89] found id: ""
	I1201 19:36:04.993769   54581 logs.go:282] 0 containers: []
	W1201 19:36:04.993776   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:04.993782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:04.993844   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:05.019859   54581 cri.go:89] found id: ""
	I1201 19:36:05.019873   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.019881   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:05.019886   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:05.019943   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:05.047016   54581 cri.go:89] found id: ""
	I1201 19:36:05.047031   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.047038   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:05.047046   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:05.047107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:05.072292   54581 cri.go:89] found id: ""
	I1201 19:36:05.072306   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.072313   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:05.072318   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:05.072377   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:05.099842   54581 cri.go:89] found id: ""
	I1201 19:36:05.099857   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.099864   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:05.099870   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:05.099926   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:05.125552   54581 cri.go:89] found id: ""
	I1201 19:36:05.125566   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.125573   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:05.125579   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:05.125635   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:05.150637   54581 cri.go:89] found id: ""
	I1201 19:36:05.150651   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.150659   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:05.150667   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:05.150677   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:05.218391   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:05.218410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:05.246651   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:05.246670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:05.303677   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:05.303694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:05.314794   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:05.314809   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:05.380077   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:07.881622   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:07.893048   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:07.893109   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:07.918109   54581 cri.go:89] found id: ""
	I1201 19:36:07.918122   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.918129   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:07.918134   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:07.918196   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:07.943504   54581 cri.go:89] found id: ""
	I1201 19:36:07.943518   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.943525   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:07.943536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:07.943595   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:07.969943   54581 cri.go:89] found id: ""
	I1201 19:36:07.969958   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.969965   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:07.969971   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:07.970033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:07.994994   54581 cri.go:89] found id: ""
	I1201 19:36:07.995009   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.995015   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:07.995021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:07.995083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:08.020591   54581 cri.go:89] found id: ""
	I1201 19:36:08.020605   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.020612   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:08.020617   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:08.020676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:08.053041   54581 cri.go:89] found id: ""
	I1201 19:36:08.053056   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.053063   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:08.053069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:08.053129   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:08.084333   54581 cri.go:89] found id: ""
	I1201 19:36:08.084346   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.084353   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:08.084361   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:08.084371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:08.099534   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:08.099551   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:08.163985   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:08.163995   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:08.164006   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:08.224823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:08.224840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:08.256602   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:08.256618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:10.818842   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:10.829650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:10.829713   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:10.866261   54581 cri.go:89] found id: ""
	I1201 19:36:10.866275   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.866293   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:10.866299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:10.866378   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:10.902129   54581 cri.go:89] found id: ""
	I1201 19:36:10.902157   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.902166   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:10.902171   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:10.902287   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:10.935780   54581 cri.go:89] found id: ""
	I1201 19:36:10.935796   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.935803   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:10.935809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:10.935868   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:10.961965   54581 cri.go:89] found id: ""
	I1201 19:36:10.961979   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.961987   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:10.961993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:10.962050   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:10.988752   54581 cri.go:89] found id: ""
	I1201 19:36:10.988765   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.988772   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:10.988778   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:10.988855   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:11.013768   54581 cri.go:89] found id: ""
	I1201 19:36:11.013783   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.013790   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:11.013795   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:11.013852   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:11.039944   54581 cri.go:89] found id: ""
	I1201 19:36:11.039959   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.039982   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:11.039992   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:11.040003   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:11.096281   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:11.096300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:11.107964   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:11.107989   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:11.174240   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:11.174253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:11.174265   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:11.240383   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:11.240406   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.770524   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:13.780691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:13.780754   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:13.805306   54581 cri.go:89] found id: ""
	I1201 19:36:13.805321   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.805328   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:13.805333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:13.805390   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:13.830209   54581 cri.go:89] found id: ""
	I1201 19:36:13.830223   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.830229   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:13.830235   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:13.830294   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:13.859814   54581 cri.go:89] found id: ""
	I1201 19:36:13.859827   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.859834   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:13.859839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:13.859905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:13.888545   54581 cri.go:89] found id: ""
	I1201 19:36:13.888559   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.888567   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:13.888573   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:13.888642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:13.918445   54581 cri.go:89] found id: ""
	I1201 19:36:13.918459   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.918466   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:13.918471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:13.918530   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:13.944112   54581 cri.go:89] found id: ""
	I1201 19:36:13.944125   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.944132   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:13.944147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:13.944206   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:13.969842   54581 cri.go:89] found id: ""
	I1201 19:36:13.969856   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.969863   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:13.969872   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:13.969882   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.999132   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:13.999150   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:14.056959   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:14.056979   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:14.068288   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:14.068304   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:14.137988   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:14.128502   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.129198   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.131274   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.132362   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.133913   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:36:14.137997   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:14.138008   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:16.704768   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:16.715111   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:16.715170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:16.740051   54581 cri.go:89] found id: ""
	I1201 19:36:16.740065   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.740072   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:16.740078   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:16.740150   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:16.765291   54581 cri.go:89] found id: ""
	I1201 19:36:16.765309   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.765317   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:16.765323   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:16.765380   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:16.790212   54581 cri.go:89] found id: ""
	I1201 19:36:16.790226   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.790233   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:16.790238   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:16.790297   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:16.814700   54581 cri.go:89] found id: ""
	I1201 19:36:16.814714   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.814721   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:16.814726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:16.814785   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:16.851986   54581 cri.go:89] found id: ""
	I1201 19:36:16.852000   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.852007   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:16.852012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:16.852067   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:16.883217   54581 cri.go:89] found id: ""
	I1201 19:36:16.883231   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.883237   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:16.883243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:16.883301   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:16.922552   54581 cri.go:89] found id: ""
	I1201 19:36:16.922566   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.922574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:16.922582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:16.922591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:16.982282   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:16.982300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:16.993387   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:16.993401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:17.063398   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:17.055109   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.055736   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.057541   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.058088   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.059799   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:36:17.063409   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:17.063421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:17.125575   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:17.125594   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.654741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:19.665378   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:19.665445   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:19.690531   54581 cri.go:89] found id: ""
	I1201 19:36:19.690545   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.690553   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:19.690559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:19.690617   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:19.715409   54581 cri.go:89] found id: ""
	I1201 19:36:19.715423   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.715431   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:19.715436   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:19.715494   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:19.743995   54581 cri.go:89] found id: ""
	I1201 19:36:19.744009   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.744016   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:19.744021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:19.744078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:19.769191   54581 cri.go:89] found id: ""
	I1201 19:36:19.769204   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.769212   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:19.769217   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:19.769286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:19.793617   54581 cri.go:89] found id: ""
	I1201 19:36:19.793631   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.793638   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:19.793644   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:19.793704   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:19.818818   54581 cri.go:89] found id: ""
	I1201 19:36:19.818832   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.818840   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:19.818845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:19.818914   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:19.852332   54581 cri.go:89] found id: ""
	I1201 19:36:19.852346   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.852368   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:19.852378   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:19.852389   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.884627   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:19.884642   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:19.947006   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:19.947026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:19.958524   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:19.958539   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:20.040965   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:20.013332   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.014049   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.015824   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.016507   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.018079   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:36:20.040976   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:20.040988   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:22.622750   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:22.637572   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:22.637637   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:22.667700   54581 cri.go:89] found id: ""
	I1201 19:36:22.667714   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.667721   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:22.667727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:22.667786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:22.700758   54581 cri.go:89] found id: ""
	I1201 19:36:22.700776   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.700802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:22.700815   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:22.700916   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:22.727217   54581 cri.go:89] found id: ""
	I1201 19:36:22.727230   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.727238   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:22.727243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:22.727299   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:22.753365   54581 cri.go:89] found id: ""
	I1201 19:36:22.753379   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.753386   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:22.753392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:22.753459   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:22.779306   54581 cri.go:89] found id: ""
	I1201 19:36:22.779320   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.779327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:22.779336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:22.779394   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:22.804830   54581 cri.go:89] found id: ""
	I1201 19:36:22.804844   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.804860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:22.804866   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:22.804924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:22.831440   54581 cri.go:89] found id: ""
	I1201 19:36:22.831470   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.831478   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:22.831486   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:22.831496   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:22.889394   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:22.889412   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:22.901968   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:22.901983   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:22.974567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:22.965837   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.966826   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968514   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968930   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.970623   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:36:22.974577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:22.974588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:23.043112   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:23.043130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.573279   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:25.584019   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:25.584078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:25.613416   54581 cri.go:89] found id: ""
	I1201 19:36:25.613430   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.613446   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:25.613452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:25.613541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:25.638108   54581 cri.go:89] found id: ""
	I1201 19:36:25.638121   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.638132   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:25.638138   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:25.638198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:25.667581   54581 cri.go:89] found id: ""
	I1201 19:36:25.667596   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.667603   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:25.667608   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:25.667676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:25.695307   54581 cri.go:89] found id: ""
	I1201 19:36:25.695320   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.695328   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:25.695333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:25.695396   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:25.719360   54581 cri.go:89] found id: ""
	I1201 19:36:25.719386   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.719394   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:25.719399   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:25.719466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:25.745097   54581 cri.go:89] found id: ""
	I1201 19:36:25.745120   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.745127   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:25.745133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:25.745207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:25.769545   54581 cri.go:89] found id: ""
	I1201 19:36:25.769558   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.769565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:25.769573   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:25.769584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.799870   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:25.799887   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:25.856015   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:25.856035   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:25.868391   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:25.868407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:25.939423   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:25.931657   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.932304   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.933988   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.934305   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.935915   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1201 19:36:25.939433   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:25.939443   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.503343   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:28.515763   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:28.515836   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:28.541630   54581 cri.go:89] found id: ""
	I1201 19:36:28.541644   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.541652   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:28.541657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:28.541728   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:28.568196   54581 cri.go:89] found id: ""
	I1201 19:36:28.568210   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.568217   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:28.568222   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:28.568280   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:28.593437   54581 cri.go:89] found id: ""
	I1201 19:36:28.593450   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.593457   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:28.593463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:28.593557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:28.619497   54581 cri.go:89] found id: ""
	I1201 19:36:28.619511   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.619518   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:28.619523   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:28.619583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:28.647866   54581 cri.go:89] found id: ""
	I1201 19:36:28.647880   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.647887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:28.647893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:28.647950   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:28.673922   54581 cri.go:89] found id: ""
	I1201 19:36:28.673935   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.673943   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:28.673949   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:28.674021   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:28.698912   54581 cri.go:89] found id: ""
	I1201 19:36:28.698926   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.698933   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:28.698941   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:28.698963   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:28.756082   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:28.756100   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:28.767897   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:28.767913   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:28.836301   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:28.836312   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:28.836330   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.907788   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:28.907807   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.438620   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:31.448979   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:31.449042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:31.475188   54581 cri.go:89] found id: ""
	I1201 19:36:31.475202   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.475209   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:31.475215   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:31.475281   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:31.500385   54581 cri.go:89] found id: ""
	I1201 19:36:31.500398   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.500405   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:31.500411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:31.500468   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:31.525394   54581 cri.go:89] found id: ""
	I1201 19:36:31.525407   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.525414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:31.525419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:31.525481   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:31.550792   54581 cri.go:89] found id: ""
	I1201 19:36:31.550808   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.550815   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:31.550821   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:31.550880   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:31.578076   54581 cri.go:89] found id: ""
	I1201 19:36:31.578090   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.578097   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:31.578102   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:31.578159   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:31.604021   54581 cri.go:89] found id: ""
	I1201 19:36:31.604035   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.604042   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:31.604047   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:31.604108   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:31.633105   54581 cri.go:89] found id: ""
	I1201 19:36:31.633119   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.633126   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:31.633134   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:31.633145   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.663524   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:31.663540   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:31.723171   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:31.723189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:31.734100   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:31.734115   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:31.796567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:31.796577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:31.796588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:34.366168   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:34.376457   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:34.376516   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:34.404948   54581 cri.go:89] found id: ""
	I1201 19:36:34.404977   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.404985   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:34.404991   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:34.405063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:34.431692   54581 cri.go:89] found id: ""
	I1201 19:36:34.431706   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.431713   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:34.431718   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:34.431779   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:34.456671   54581 cri.go:89] found id: ""
	I1201 19:36:34.456685   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.456692   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:34.456697   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:34.456755   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:34.481585   54581 cri.go:89] found id: ""
	I1201 19:36:34.481612   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.481620   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:34.481626   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:34.481696   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:34.506818   54581 cri.go:89] found id: ""
	I1201 19:36:34.506832   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.506839   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:34.506845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:34.506906   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:34.535407   54581 cri.go:89] found id: ""
	I1201 19:36:34.535421   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.535428   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:34.535433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:34.535492   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:34.561311   54581 cri.go:89] found id: ""
	I1201 19:36:34.561324   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.561331   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:34.561339   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:34.561350   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:34.592150   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:34.592167   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:34.648352   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:34.648370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:34.659451   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:34.659467   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:34.728942   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:34.728952   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:34.728962   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:37.291213   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:37.301261   54581 kubeadm.go:602] duration metric: took 4m4.008784532s to restartPrimaryControlPlane
	W1201 19:36:37.301323   54581 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1201 19:36:37.301393   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:36:37.706665   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:36:37.720664   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:36:37.728529   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:36:37.728581   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:36:37.736430   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:36:37.736440   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:36:37.736492   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:36:37.744494   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:36:37.744550   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:36:37.752457   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:36:37.760187   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:36:37.760243   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:36:37.768060   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.775900   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:36:37.775969   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.783655   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:36:37.791670   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:36:37.791723   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:36:37.799641   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:36:37.841794   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:36:37.841853   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:36:37.909907   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:36:37.909969   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:36:37.910004   54581 kubeadm.go:319] OS: Linux
	I1201 19:36:37.910048   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:36:37.910095   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:36:37.910141   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:36:37.910188   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:36:37.910235   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:36:37.910281   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:36:37.910325   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:36:37.910372   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:36:37.910417   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:36:37.982652   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:36:37.982760   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:36:37.982849   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:36:37.989962   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:36:37.995459   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:36:37.995557   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:36:37.995632   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:36:37.995718   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:36:37.995796   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:36:37.995875   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:36:37.995938   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:36:37.996008   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:36:37.996076   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:36:37.996160   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:36:37.996243   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:36:37.996290   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:36:37.996352   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:36:38.264574   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:36:38.510797   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:36:39.269570   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:36:39.443703   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:36:40.036623   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:36:40.036725   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:36:40.042253   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:36:40.045573   54581 out.go:252]   - Booting up control plane ...
	I1201 19:36:40.045681   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:36:40.045758   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:36:40.050263   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:36:40.088031   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:36:40.088133   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:36:40.088246   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:36:40.088332   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:36:40.088370   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:36:40.243689   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:36:40.243803   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:40:40.243834   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000165379s
	I1201 19:40:40.243866   54581 kubeadm.go:319] 
	I1201 19:40:40.243923   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:40:40.243956   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:40:40.244085   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:40:40.244090   54581 kubeadm.go:319] 
	I1201 19:40:40.244193   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:40:40.244226   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:40:40.244256   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:40:40.244260   54581 kubeadm.go:319] 
	I1201 19:40:40.248975   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:40:40.249435   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:40:40.249566   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:40:40.249901   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1201 19:40:40.249908   54581 kubeadm.go:319] 
	I1201 19:40:40.249980   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1201 19:40:40.250118   54581 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000165379s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1201 19:40:40.250247   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:40:40.662369   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:40:40.675843   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:40:40.675896   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:40:40.683554   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:40:40.683563   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:40:40.683613   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:40:40.691612   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:40:40.691669   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:40:40.699280   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:40:40.706997   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:40:40.707052   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:40:40.714497   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.722891   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:40:40.722949   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.730907   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:40:40.739761   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:40:40.739818   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:40:40.747474   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:40:40.788983   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:40:40.789292   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:40:40.865634   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:40:40.865697   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:40:40.865734   54581 kubeadm.go:319] OS: Linux
	I1201 19:40:40.865777   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:40:40.865824   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:40:40.865869   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:40:40.865916   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:40:40.865963   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:40:40.866013   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:40:40.866057   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:40:40.866104   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:40:40.866149   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:40:40.935875   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:40:40.935986   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:40:40.936084   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:40:40.941886   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:40:40.947334   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:40:40.947424   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:40:40.947488   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:40:40.947568   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:40:40.947628   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:40:40.947696   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:40:40.947749   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:40:40.947810   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:40:40.947870   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:40:40.947944   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:40:40.948014   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:40:40.948051   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:40:40.948105   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:40:41.580020   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:40:42.099824   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:40:42.537556   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:40:42.996026   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:40:43.565704   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:40:43.566397   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:40:43.569105   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:40:43.572244   54581 out.go:252]   - Booting up control plane ...
	I1201 19:40:43.572342   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:40:43.572765   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:40:43.573983   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:40:43.595015   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:40:43.595116   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:40:43.603073   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:40:43.603347   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:40:43.603559   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:40:43.744445   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:40:43.744558   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:44:43.744318   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000287424s
	I1201 19:44:43.744348   54581 kubeadm.go:319] 
	I1201 19:44:43.744432   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:44:43.744486   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:44:43.744623   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:44:43.744628   54581 kubeadm.go:319] 
	I1201 19:44:43.744749   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:44:43.744781   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:44:43.744822   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:44:43.744831   54581 kubeadm.go:319] 
	I1201 19:44:43.748926   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:44:43.749322   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:44:43.749424   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:44:43.749683   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1201 19:44:43.749689   54581 kubeadm.go:319] 
	I1201 19:44:43.749753   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1201 19:44:43.749803   54581 kubeadm.go:403] duration metric: took 12m10.492478835s to StartCluster
	I1201 19:44:43.749833   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:44:43.749893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:44:43.774966   54581 cri.go:89] found id: ""
	I1201 19:44:43.774979   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.774986   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:44:43.774992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:44:43.775053   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:44:43.800769   54581 cri.go:89] found id: ""
	I1201 19:44:43.800783   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.800790   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:44:43.800796   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:44:43.800854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:44:43.827282   54581 cri.go:89] found id: ""
	I1201 19:44:43.827295   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.827302   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:44:43.827308   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:44:43.827364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:44:43.853930   54581 cri.go:89] found id: ""
	I1201 19:44:43.853944   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.853951   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:44:43.853957   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:44:43.854013   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:44:43.882816   54581 cri.go:89] found id: ""
	I1201 19:44:43.882830   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.882837   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:44:43.882843   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:44:43.882903   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:44:43.909261   54581 cri.go:89] found id: ""
	I1201 19:44:43.909274   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.909281   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:44:43.909287   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:44:43.909344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:44:43.933693   54581 cri.go:89] found id: ""
	I1201 19:44:43.933706   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.933715   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:44:43.933724   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:44:43.933733   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:44:43.990075   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:44:43.990092   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:44:44.001155   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:44:44.001170   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:44:44.070458   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:44:44.070469   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:44:44.070479   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:44:44.136228   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:44:44.136248   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1201 19:44:44.166389   54581 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1201 19:44:44.166422   54581 out.go:285] * 
	W1201 19:44:44.166485   54581 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:44:44.166502   54581 out.go:285] * 
	W1201 19:44:44.168627   54581 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:44:44.175592   54581 out.go:203] 
	W1201 19:44:44.179124   54581 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 19:44:44.179186   54581 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1201 19:44:44.179207   54581 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1201 19:44:44.182569   54581 out.go:203] 
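The failure above reports the kubelet refusing to start on a cgroup v1 host. A hedged way to confirm which cgroup version the node exposes is to inspect the filesystem type of /sys/fs/cgroup; this sketch substitutes an example value for the live `stat -fc %T /sys/fs/cgroup` call:

```shell
# Map the filesystem type of /sys/fs/cgroup to a cgroup version.
# On a live node you would use: fstype="$(stat -fc %T /sys/fs/cgroup)"
fstype="cgroup2fs"   # example value; 'tmpfs' typically indicates cgroup v1
case "$fstype" in
  cgroup2fs) echo "cgroup v2" ;;
  tmpfs)     echo "cgroup v1" ;;
  *)         echo "unknown cgroup layout" ;;
esac
```

On the 5.15.0-1084-aws kernel shown in the preflight output, a `tmpfs` result would be consistent with the cgroup v1 warnings logged here.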
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667922542Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667937483Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667949232Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667962787Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668042514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668060319Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668082095Z" level=info msg="runtime interface created"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668088528Z" level=info msg="created NRI interface"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668104946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668151788Z" level=info msg="Connect containerd service"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668662446Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.670384243Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682522727Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682782323Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682944050Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682830321Z" level=info msg="Start recovering state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708040674Z" level=info msg="Start event monitor"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708239258Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708327772Z" level=info msg="Start streaming server"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708412037Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708599093Z" level=info msg="runtime interface starting up..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708668573Z" level=info msg="starting plugins..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708729215Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708948574Z" level=info msg="containerd successfully booted in 0.060821s"
	Dec 01 19:32:31 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:46:55.976762   23769 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:55.977280   23769 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:55.980374   23769 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:55.981050   23769 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:55.982575   23769 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:46:56 up  1:29,  0 user,  load average: 0.66, 0.32, 0.40
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:46:52 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:53 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 492.
	Dec 01 19:46:53 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:53 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:53 functional-428744 kubelet[23573]: E1201 19:46:53.173776   23573 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:53 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:53 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:53 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 493.
	Dec 01 19:46:53 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:53 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:53 functional-428744 kubelet[23614]: E1201 19:46:53.909515   23614 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:53 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:53 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:54 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 494.
	Dec 01 19:46:54 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:54 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:54 functional-428744 kubelet[23657]: E1201 19:46:54.638621   23657 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:54 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:54 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:55 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 495.
	Dec 01 19:46:55 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:55 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:55 functional-428744 kubelet[23686]: E1201 19:46:55.411659   23686 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:55 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:55 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
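The kubelet restart loop above fails validation because cgroup v1 support now requires an explicit opt-in. A minimal sketch of the override named in the preflight warning, assuming the lower-camel spelling `failCgroupV1` is the config-file form of the `FailCgroupV1` option the warning mentions (verify against the exact kubelet release):

```shell
# Hypothetical KubeletConfiguration fragment opting back in to cgroup v1.
# Assumption: 'failCgroupV1' is the config-file spelling of the
# 'FailCgroupV1' option named in the preflight warning.
frag='apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false'
echo "$frag"
```

In a minikube context such kubelet settings are typically passed through `--extra-config=kubelet.<option>=<value>` on `minikube start`, as the suggestion earlier in this log indicates.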
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (367.993215ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd (3.19s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect (2.43s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-428744 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1636: (dbg) Non-zero exit: kubectl --context functional-428744 create deployment hello-node-connect --image kicbase/echo-server: exit status 1 (54.263228ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1638: failed to create hello-node deployment with this command "kubectl --context functional-428744 create deployment hello-node-connect --image kicbase/echo-server": exit status 1.
functional_test.go:1608: service test failed - dumping debug information
functional_test.go:1609: -----------------------service failure post-mortem--------------------------------
functional_test.go:1612: (dbg) Run:  kubectl --context functional-428744 describe po hello-node-connect
functional_test.go:1612: (dbg) Non-zero exit: kubectl --context functional-428744 describe po hello-node-connect: exit status 1 (58.28036ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1614: "kubectl --context functional-428744 describe po hello-node-connect" failed: exit status 1
functional_test.go:1616: hello-node pod describe:
functional_test.go:1618: (dbg) Run:  kubectl --context functional-428744 logs -l app=hello-node-connect
functional_test.go:1618: (dbg) Non-zero exit: kubectl --context functional-428744 logs -l app=hello-node-connect: exit status 1 (61.01307ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1620: "kubectl --context functional-428744 logs -l app=hello-node-connect" failed: exit status 1
functional_test.go:1622: hello-node logs:
functional_test.go:1624: (dbg) Run:  kubectl --context functional-428744 describe svc hello-node-connect
functional_test.go:1624: (dbg) Non-zero exit: kubectl --context functional-428744 describe svc hello-node-connect: exit status 1 (58.234615ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1626: "kubectl --context functional-428744 describe svc hello-node-connect" failed: exit status 1
functional_test.go:1628: hello-node svc describe:
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (337.396636ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                            ARGS                                                                             │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ cache   │ functional-428744 cache reload                                                                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ ssh     │ functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                            │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                         │ minikube          │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │ 01 Dec 25 19:32 UTC │
	│ kubectl │ functional-428744 kubectl -- --context functional-428744 get pods                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ start   │ -p functional-428744 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                    │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:32 UTC │                     │
	│ config  │ functional-428744 config unset cpus                                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ cp      │ functional-428744 cp testdata/cp-test.txt /home/docker/cp-test.txt                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ config  │ functional-428744 config get cpus                                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │                     │
	│ config  │ functional-428744 config set cpus 2                                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ config  │ functional-428744 config get cpus                                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ config  │ functional-428744 config unset cpus                                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ config  │ functional-428744 config get cpus                                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │                     │
	│ ssh     │ functional-428744 ssh -n functional-428744 sudo cat /home/docker/cp-test.txt                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ ssh     │ functional-428744 ssh echo hello                                                                                                                            │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ cp      │ functional-428744 cp functional-428744:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelCp929166965/001/cp-test.txt │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ ssh     │ functional-428744 ssh cat /etc/hostname                                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ tunnel  │ functional-428744 tunnel --alsologtostderr                                                                                                                  │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │                     │
	│ ssh     │ functional-428744 ssh -n functional-428744 sudo cat /home/docker/cp-test.txt                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ tunnel  │ functional-428744 tunnel --alsologtostderr                                                                                                                  │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │                     │
	│ cp      │ functional-428744 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt                                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ tunnel  │ functional-428744 tunnel --alsologtostderr                                                                                                                  │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │                     │
	│ ssh     │ functional-428744 ssh -n functional-428744 sudo cat /tmp/does/not/exist/cp-test.txt                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:44 UTC │ 01 Dec 25 19:44 UTC │
	│ addons  │ functional-428744 addons list                                                                                                                               │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ addons  │ functional-428744 addons list -o json                                                                                                                       │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:32:28
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:32:28.671063   54581 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:32:28.671177   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671181   54581 out.go:374] Setting ErrFile to fd 2...
	I1201 19:32:28.671185   54581 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:32:28.671462   54581 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:32:28.671791   54581 out.go:368] Setting JSON to false
	I1201 19:32:28.672593   54581 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":4500,"bootTime":1764613049,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:32:28.672645   54581 start.go:143] virtualization:  
	I1201 19:32:28.676118   54581 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:32:28.679062   54581 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:32:28.679153   54581 notify.go:221] Checking for updates...
	I1201 19:32:28.685968   54581 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:32:28.688852   54581 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:32:28.691733   54581 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:32:28.694613   54581 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:32:28.697549   54581 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:32:28.700837   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:28.700934   54581 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:32:28.730800   54581 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:32:28.730894   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.786972   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.776963779 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.787065   54581 docker.go:319] overlay module found
	I1201 19:32:28.789990   54581 out.go:179] * Using the docker driver based on existing profile
	I1201 19:32:28.792702   54581 start.go:309] selected driver: docker
	I1201 19:32:28.792712   54581 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.792814   54581 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:32:28.792926   54581 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:32:28.854079   54581 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-01 19:32:28.841219008 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:32:28.854498   54581 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1201 19:32:28.854520   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:28.854580   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:28.854619   54581 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:28.858061   54581 out.go:179] * Starting "functional-428744" primary control-plane node in "functional-428744" cluster
	I1201 19:32:28.860972   54581 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:32:28.863997   54581 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:32:28.866788   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:28.866980   54581 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:32:28.895611   54581 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 19:32:28.895623   54581 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 19:32:28.922565   54581 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 19:32:29.117617   54581 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 19:32:29.117759   54581 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/config.json ...
	I1201 19:32:29.117789   54581 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117872   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 19:32:29.117882   54581 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 108.863µs
	I1201 19:32:29.117888   54581 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 19:32:29.117898   54581 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117926   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 19:32:29.117930   54581 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 33.443µs
	I1201 19:32:29.117935   54581 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117944   54581 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.117979   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 19:32:29.117983   54581 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 40.647µs
	I1201 19:32:29.117988   54581 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 19:32:29.117998   54581 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118023   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 19:32:29.118035   54581 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 30.974µs
	I1201 19:32:29.118040   54581 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118040   54581 cache.go:243] Successfully downloaded all kic artifacts
	I1201 19:32:29.118048   54581 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118072   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 19:32:29.118066   54581 start.go:360] acquireMachinesLock for functional-428744: {Name:mk3b5a813e1aa5988e2f3f833300a148fed85bf9 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118075   54581 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 28.709µs
	I1201 19:32:29.118080   54581 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 19:32:29.118088   54581 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118102   54581 start.go:364] duration metric: took 25.197µs to acquireMachinesLock for "functional-428744"
	I1201 19:32:29.118113   54581 start.go:96] Skipping create...Using existing machine configuration
	I1201 19:32:29.118114   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 19:32:29.118117   54581 fix.go:54] fixHost starting: 
	I1201 19:32:29.118118   54581 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 31.457µs
	I1201 19:32:29.118122   54581 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 19:32:29.118129   54581 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118152   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 19:32:29.118156   54581 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 27.199µs
	I1201 19:32:29.118160   54581 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 19:32:29.118167   54581 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 19:32:29.118216   54581 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 19:32:29.118220   54581 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 53.562µs
	I1201 19:32:29.118229   54581 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 19:32:29.118236   54581 cache.go:87] Successfully saved all images to host disk.
	I1201 19:32:29.118392   54581 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
	I1201 19:32:29.135509   54581 fix.go:112] recreateIfNeeded on functional-428744: state=Running err=<nil>
	W1201 19:32:29.135543   54581 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 19:32:29.140504   54581 out.go:252] * Updating the running docker "functional-428744" container ...
	I1201 19:32:29.140530   54581 machine.go:94] provisionDockerMachine start ...
	I1201 19:32:29.140609   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.157677   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.157997   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.158004   54581 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 19:32:29.305012   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.305026   54581 ubuntu.go:182] provisioning hostname "functional-428744"
	I1201 19:32:29.305098   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.323134   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.323429   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.323437   54581 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-428744 && echo "functional-428744" | sudo tee /etc/hostname
	I1201 19:32:29.478458   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-428744
	
	I1201 19:32:29.478532   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.497049   54581 main.go:143] libmachine: Using SSH client type: native
	I1201 19:32:29.498161   54581 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1201 19:32:29.498184   54581 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-428744' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-428744/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-428744' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 19:32:29.645663   54581 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 19:32:29.645679   54581 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 19:32:29.645696   54581 ubuntu.go:190] setting up certificates
	I1201 19:32:29.645703   54581 provision.go:84] configureAuth start
	I1201 19:32:29.645772   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:29.663161   54581 provision.go:143] copyHostCerts
	I1201 19:32:29.663227   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 19:32:29.663233   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 19:32:29.663306   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 19:32:29.663413   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 19:32:29.663416   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 19:32:29.663441   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 19:32:29.663488   54581 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 19:32:29.663496   54581 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 19:32:29.663517   54581 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 19:32:29.663560   54581 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.functional-428744 san=[127.0.0.1 192.168.49.2 functional-428744 localhost minikube]
	I1201 19:32:29.922590   54581 provision.go:177] copyRemoteCerts
	I1201 19:32:29.922645   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 19:32:29.922682   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:29.944750   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.066257   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 19:32:30.114189   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1201 19:32:30.139869   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 19:32:30.162018   54581 provision.go:87] duration metric: took 516.289617ms to configureAuth
	I1201 19:32:30.162044   54581 ubuntu.go:206] setting minikube options for container-runtime
	I1201 19:32:30.162294   54581 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:32:30.162301   54581 machine.go:97] duration metric: took 1.021765793s to provisionDockerMachine
	I1201 19:32:30.162308   54581 start.go:293] postStartSetup for "functional-428744" (driver="docker")
	I1201 19:32:30.162319   54581 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 19:32:30.162368   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 19:32:30.162422   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.181979   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.285977   54581 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 19:32:30.289531   54581 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 19:32:30.289549   54581 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 19:32:30.289559   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 19:32:30.289616   54581 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 19:32:30.289694   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 19:32:30.289767   54581 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts -> hosts in /etc/test/nested/copy/4305
	I1201 19:32:30.289821   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4305
	I1201 19:32:30.297763   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:30.315893   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts --> /etc/test/nested/copy/4305/hosts (40 bytes)
	I1201 19:32:30.335096   54581 start.go:296] duration metric: took 172.774471ms for postStartSetup
	I1201 19:32:30.335168   54581 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:32:30.335214   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.355398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.458545   54581 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 19:32:30.463103   54581 fix.go:56] duration metric: took 1.344978374s for fixHost
	I1201 19:32:30.463118   54581 start.go:83] releasing machines lock for "functional-428744", held for 1.345010357s
	I1201 19:32:30.463185   54581 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-428744
	I1201 19:32:30.480039   54581 ssh_runner.go:195] Run: cat /version.json
	I1201 19:32:30.480081   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.480337   54581 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 19:32:30.480395   54581 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
	I1201 19:32:30.499221   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.501398   54581 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
	I1201 19:32:30.601341   54581 ssh_runner.go:195] Run: systemctl --version
	I1201 19:32:30.695138   54581 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1201 19:32:30.699523   54581 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 19:32:30.699612   54581 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 19:32:30.707379   54581 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 19:32:30.707392   54581 start.go:496] detecting cgroup driver to use...
	I1201 19:32:30.707423   54581 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 19:32:30.707469   54581 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 19:32:30.722782   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 19:32:30.736023   54581 docker.go:218] disabling cri-docker service (if available) ...
	I1201 19:32:30.736084   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 19:32:30.751857   54581 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 19:32:30.765106   54581 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 19:32:30.881005   54581 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 19:32:31.019194   54581 docker.go:234] disabling docker service ...
	I1201 19:32:31.019259   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 19:32:31.037044   54581 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 19:32:31.052926   54581 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 19:32:31.181456   54581 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 19:32:31.340481   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 19:32:31.355001   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 19:32:31.370840   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 19:32:31.380231   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 19:32:31.389693   54581 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 19:32:31.389764   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 19:32:31.399360   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.408437   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 19:32:31.417370   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 19:32:31.426455   54581 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 19:32:31.434636   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 19:32:31.443735   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 19:32:31.453324   54581 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 19:32:31.462516   54581 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 19:32:31.470270   54581 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 19:32:31.478172   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:31.592137   54581 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 19:32:31.712107   54581 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 19:32:31.712186   54581 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 19:32:31.715994   54581 start.go:564] Will wait 60s for crictl version
	I1201 19:32:31.716056   54581 ssh_runner.go:195] Run: which crictl
	I1201 19:32:31.719610   54581 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 19:32:31.745073   54581 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 19:32:31.745152   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.765358   54581 ssh_runner.go:195] Run: containerd --version
	I1201 19:32:31.791628   54581 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 19:32:31.794721   54581 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 19:32:31.811133   54581 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1201 19:32:31.818179   54581 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1201 19:32:31.821064   54581 kubeadm.go:884] updating cluster {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 19:32:31.821193   54581 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 19:32:31.821269   54581 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 19:32:31.856356   54581 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 19:32:31.856368   54581 cache_images.go:86] Images are preloaded, skipping loading
	I1201 19:32:31.856374   54581 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1201 19:32:31.856475   54581 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-428744 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 19:32:31.856536   54581 ssh_runner.go:195] Run: sudo crictl info
	I1201 19:32:31.895308   54581 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1201 19:32:31.895325   54581 cni.go:84] Creating CNI manager for ""
	I1201 19:32:31.895333   54581 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:32:31.895346   54581 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 19:32:31.895366   54581 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-428744 NodeName:functional-428744 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 19:32:31.895478   54581 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-428744"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 19:32:31.895541   54581 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 19:32:31.905339   54581 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 19:32:31.905406   54581 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 19:32:31.913323   54581 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1201 19:32:31.927846   54581 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 19:32:31.940396   54581 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1201 19:32:31.953139   54581 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1201 19:32:31.956806   54581 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 19:32:32.073166   54581 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 19:32:32.587407   54581 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744 for IP: 192.168.49.2
	I1201 19:32:32.587419   54581 certs.go:195] generating shared ca certs ...
	I1201 19:32:32.587436   54581 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 19:32:32.587628   54581 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 19:32:32.587672   54581 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 19:32:32.587679   54581 certs.go:257] generating profile certs ...
	I1201 19:32:32.587796   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.key
	I1201 19:32:32.587858   54581 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key.910e2deb
	I1201 19:32:32.587895   54581 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key
	I1201 19:32:32.588027   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 19:32:32.588060   54581 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 19:32:32.588067   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 19:32:32.588104   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 19:32:32.588128   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 19:32:32.588158   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 19:32:32.588202   54581 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 19:32:32.589935   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 19:32:32.611510   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 19:32:32.631449   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 19:32:32.652864   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 19:32:32.672439   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1201 19:32:32.690857   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 19:32:32.709160   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 19:32:32.727076   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 19:32:32.745055   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 19:32:32.762625   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 19:32:32.780355   54581 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 19:32:32.797626   54581 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 19:32:32.810250   54581 ssh_runner.go:195] Run: openssl version
	I1201 19:32:32.816425   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 19:32:32.825294   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829094   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.829148   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 19:32:32.869893   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 19:32:32.877720   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 19:32:32.886198   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889911   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.889967   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 19:32:32.930479   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 19:32:32.938463   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 19:32:32.946940   54581 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950621   54581 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.950676   54581 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 19:32:32.991499   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 19:32:32.999452   54581 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 19:32:33.003313   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 19:32:33.045305   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 19:32:33.087269   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 19:32:33.128376   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 19:32:33.169796   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 19:32:33.211259   54581 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 19:32:33.257335   54581 kubeadm.go:401] StartCluster: {Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:32:33.257412   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 19:32:33.257501   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.284260   54581 cri.go:89] found id: ""
	I1201 19:32:33.284320   54581 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 19:32:33.292458   54581 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 19:32:33.292468   54581 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 19:32:33.292518   54581 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 19:32:33.300158   54581 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.300668   54581 kubeconfig.go:125] found "functional-428744" server: "https://192.168.49.2:8441"
	I1201 19:32:33.301960   54581 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 19:32:33.310120   54581 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-01 19:17:59.066738599 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-01 19:32:31.946987775 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1201 19:32:33.310138   54581 kubeadm.go:1161] stopping kube-system containers ...
	I1201 19:32:33.310149   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1201 19:32:33.310213   54581 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 19:32:33.338492   54581 cri.go:89] found id: ""
	I1201 19:32:33.338551   54581 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1201 19:32:33.356342   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:32:33.364607   54581 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec  1 19:22 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5636 Dec  1 19:22 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5676 Dec  1 19:22 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5588 Dec  1 19:22 /etc/kubernetes/scheduler.conf
	
	I1201 19:32:33.364669   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:32:33.372608   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:32:33.380647   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.380700   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:32:33.388464   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.397123   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.397189   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:32:33.404816   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:32:33.412562   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 19:32:33.412628   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:32:33.420390   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:32:33.428330   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:33.477124   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.484075   54581 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.006926734s)
	I1201 19:32:34.484135   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.694382   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.769616   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1201 19:32:34.812433   54581 api_server.go:52] waiting for apiserver process to appear ...
	I1201 19:32:34.812505   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.313033   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:35.812993   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.312704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:36.813245   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.313300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:37.812687   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.312636   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:38.813205   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:39.813572   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.312587   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:40.812696   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.313535   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:41.813472   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.312708   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:42.813224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.313067   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:43.813328   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.312678   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:44.813484   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.312731   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:45.812683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.313429   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:46.813026   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.312606   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:47.812689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:48.813689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.313474   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:49.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.312618   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:50.813410   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.313371   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:51.812979   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.312792   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:52.812691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.313042   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:53.813445   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.313212   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:54.812741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.312722   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:55.812580   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.313621   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:56.813459   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.313224   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:57.812880   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.313609   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:58.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.313283   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:32:59.812739   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.313558   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:00.813248   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.313098   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:01.813623   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.313600   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:02.813357   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.312559   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:03.812827   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.312653   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:04.812616   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.313447   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:05.813117   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.312712   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:06.812713   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.314198   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:07.812943   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.313642   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:08.813457   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.313464   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:09.812697   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.312626   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:10.813299   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.313365   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:11.813267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.312931   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:12.812887   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.312894   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:13.813197   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.312689   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:14.812595   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.313557   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:15.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.313428   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:16.813327   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.313520   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:17.812744   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.313564   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:18.812611   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.313634   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:19.813393   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.313426   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:20.812688   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.313372   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:21.812638   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.313360   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:22.812897   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.313015   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:23.813101   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.312709   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:24.812907   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.312644   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:25.812569   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.313009   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:26.813448   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.312851   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:27.813268   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.313602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:28.813463   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.312692   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:29.813538   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.313307   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:30.813008   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.313397   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:31.812682   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.313454   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:32.813423   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.313344   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:33.813145   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:34.312690   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:34.813369   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:34.813443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:34.847624   54581 cri.go:89] found id: ""
	I1201 19:33:34.847638   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.847645   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:34.847650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:34.847707   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:34.877781   54581 cri.go:89] found id: ""
	I1201 19:33:34.877795   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.877802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:34.877807   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:34.877865   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:34.906556   54581 cri.go:89] found id: ""
	I1201 19:33:34.906569   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.906575   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:34.906581   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:34.906638   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:34.932243   54581 cri.go:89] found id: ""
	I1201 19:33:34.932257   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.932264   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:34.932275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:34.932334   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:34.958307   54581 cri.go:89] found id: ""
	I1201 19:33:34.958320   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.958327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:34.958333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:34.958393   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:34.987839   54581 cri.go:89] found id: ""
	I1201 19:33:34.987852   54581 logs.go:282] 0 containers: []
	W1201 19:33:34.987860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:34.987865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:34.987924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:35.013339   54581 cri.go:89] found id: ""
	I1201 19:33:35.013353   54581 logs.go:282] 0 containers: []
	W1201 19:33:35.013360   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:35.013367   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:35.013377   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:35.024284   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:35.024300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:35.102562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:35.094922   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.095513   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097249   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.097760   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:35.099198   11347 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:35.102584   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:35.102595   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:35.168823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:35.168843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:35.200459   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:35.200475   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:37.759267   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:37.769446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:37.769528   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:37.794441   54581 cri.go:89] found id: ""
	I1201 19:33:37.794454   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.794461   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:37.794467   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:37.794522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:37.825029   54581 cri.go:89] found id: ""
	I1201 19:33:37.825042   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.825049   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:37.825059   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:37.825116   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:37.855847   54581 cri.go:89] found id: ""
	I1201 19:33:37.855860   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.855867   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:37.855872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:37.855932   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:37.892812   54581 cri.go:89] found id: ""
	I1201 19:33:37.892826   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.892833   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:37.892839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:37.892902   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:37.923175   54581 cri.go:89] found id: ""
	I1201 19:33:37.923189   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.923195   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:37.923201   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:37.923260   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:37.956838   54581 cri.go:89] found id: ""
	I1201 19:33:37.956852   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.956858   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:37.956864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:37.956921   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:37.983288   54581 cri.go:89] found id: ""
	I1201 19:33:37.983302   54581 logs.go:282] 0 containers: []
	W1201 19:33:37.983309   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:37.983317   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:37.983328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:38.048803   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:38.040424   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.041279   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.042894   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.043414   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:38.044999   11451 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:38.048828   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:38.048842   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:38.114525   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:38.114549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:38.144040   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:38.144056   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:38.203160   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:38.203178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:40.714632   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:40.724993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:40.725058   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:40.749954   54581 cri.go:89] found id: ""
	I1201 19:33:40.749968   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.749975   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:40.749981   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:40.750040   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:40.775337   54581 cri.go:89] found id: ""
	I1201 19:33:40.775350   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.775357   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:40.775362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:40.775425   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:40.801568   54581 cri.go:89] found id: ""
	I1201 19:33:40.801582   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.801590   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:40.801595   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:40.801663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:40.829766   54581 cri.go:89] found id: ""
	I1201 19:33:40.829779   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.829786   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:40.829791   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:40.829850   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:40.864362   54581 cri.go:89] found id: ""
	I1201 19:33:40.864376   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.864383   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:40.864389   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:40.864447   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:40.893407   54581 cri.go:89] found id: ""
	I1201 19:33:40.893419   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.893427   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:40.893433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:40.893507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:40.919149   54581 cri.go:89] found id: ""
	I1201 19:33:40.919163   54581 logs.go:282] 0 containers: []
	W1201 19:33:40.919172   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:40.919179   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:40.919189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:40.949474   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:40.949572   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:41.005421   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:41.005440   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:41.016259   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:41.016274   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:41.078378   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:41.070966   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.071552   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.072706   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.073305   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:41.075007   11573 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:41.078391   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:41.078401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:43.641960   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:43.652106   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:43.652178   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:43.682005   54581 cri.go:89] found id: ""
	I1201 19:33:43.682018   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.682025   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:43.682030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:43.682087   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:43.707580   54581 cri.go:89] found id: ""
	I1201 19:33:43.707593   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.707600   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:43.707606   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:43.707711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:43.732400   54581 cri.go:89] found id: ""
	I1201 19:33:43.732414   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.732421   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:43.732426   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:43.732483   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:43.758218   54581 cri.go:89] found id: ""
	I1201 19:33:43.758232   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.758239   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:43.758245   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:43.758303   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:43.783139   54581 cri.go:89] found id: ""
	I1201 19:33:43.783152   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.783159   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:43.783164   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:43.783227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:43.813453   54581 cri.go:89] found id: ""
	I1201 19:33:43.813467   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.813474   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:43.813480   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:43.813548   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:43.845612   54581 cri.go:89] found id: ""
	I1201 19:33:43.845625   54581 logs.go:282] 0 containers: []
	W1201 19:33:43.845632   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:43.845639   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:43.845649   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:43.909426   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:43.909445   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:43.920543   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:43.920560   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:43.988764   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:43.979790   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.980843   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.982644   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.983393   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:43.985139   11665 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:43.988776   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:43.988797   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:44.051182   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:44.051208   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:46.583925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:46.594468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:46.594554   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:46.620265   54581 cri.go:89] found id: ""
	I1201 19:33:46.620279   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.620286   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:46.620292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:46.620351   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:46.644633   54581 cri.go:89] found id: ""
	I1201 19:33:46.644652   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.644659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:46.644665   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:46.644721   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:46.669867   54581 cri.go:89] found id: ""
	I1201 19:33:46.669881   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.669888   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:46.669893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:46.669948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:46.694417   54581 cri.go:89] found id: ""
	I1201 19:33:46.694431   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.694438   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:46.694454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:46.694512   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:46.721029   54581 cri.go:89] found id: ""
	I1201 19:33:46.721043   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.721051   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:46.721056   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:46.721114   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:46.747445   54581 cri.go:89] found id: ""
	I1201 19:33:46.747459   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.747466   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:46.747471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:46.747525   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:46.771251   54581 cri.go:89] found id: ""
	I1201 19:33:46.771266   54581 logs.go:282] 0 containers: []
	W1201 19:33:46.771272   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:46.771281   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:46.771290   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:46.829699   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:46.829716   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:46.842077   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:46.842096   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:46.924213   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:46.914235   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.914673   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.917812   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.918605   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:46.920424   11770 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:46.924225   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:46.924235   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:46.990853   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:46.990872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:49.521683   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:49.531974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:49.532042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:49.557473   54581 cri.go:89] found id: ""
	I1201 19:33:49.557514   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.557521   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:49.557527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:49.557640   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:49.583182   54581 cri.go:89] found id: ""
	I1201 19:33:49.583229   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.583237   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:49.583242   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:49.583308   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:49.611533   54581 cri.go:89] found id: ""
	I1201 19:33:49.611546   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.611553   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:49.611559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:49.611615   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:49.637433   54581 cri.go:89] found id: ""
	I1201 19:33:49.637446   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.637460   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:49.637466   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:49.637558   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:49.667274   54581 cri.go:89] found id: ""
	I1201 19:33:49.667287   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.667294   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:49.667299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:49.667358   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:49.696772   54581 cri.go:89] found id: ""
	I1201 19:33:49.696790   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.696797   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:49.696803   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:49.696861   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:49.720607   54581 cri.go:89] found id: ""
	I1201 19:33:49.720621   54581 logs.go:282] 0 containers: []
	W1201 19:33:49.720637   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:49.720645   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:49.720655   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:49.776412   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:49.776431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:49.787417   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:49.787432   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:49.862636   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:49.853807   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.854578   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856262   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.856796   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:49.858497   11868 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:49.862647   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:49.862658   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:49.934395   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:49.934421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:52.463339   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:52.473586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:52.473650   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:52.503529   54581 cri.go:89] found id: ""
	I1201 19:33:52.503542   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.503549   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:52.503555   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:52.503618   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:52.531144   54581 cri.go:89] found id: ""
	I1201 19:33:52.531158   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.531165   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:52.531170   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:52.531228   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:52.556664   54581 cri.go:89] found id: ""
	I1201 19:33:52.556678   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.556685   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:52.556691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:52.556753   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:52.583782   54581 cri.go:89] found id: ""
	I1201 19:33:52.583796   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.583802   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:52.583808   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:52.583866   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:52.608468   54581 cri.go:89] found id: ""
	I1201 19:33:52.608481   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.608488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:52.608494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:52.608553   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:52.632068   54581 cri.go:89] found id: ""
	I1201 19:33:52.632081   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.632088   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:52.632093   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:52.632153   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:52.656905   54581 cri.go:89] found id: ""
	I1201 19:33:52.656919   54581 logs.go:282] 0 containers: []
	W1201 19:33:52.656926   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:52.656934   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:52.656944   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:52.715322   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:52.715340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:52.725941   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:52.725956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:52.787814   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:52.779550   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.780374   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782069   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.782659   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:52.784265   11974 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:52.787824   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:52.787835   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:52.857124   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:52.857146   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:55.384601   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:55.394657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:55.394724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:55.419003   54581 cri.go:89] found id: ""
	I1201 19:33:55.419016   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.419023   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:55.419028   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:55.419093   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:55.444043   54581 cri.go:89] found id: ""
	I1201 19:33:55.444057   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.444064   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:55.444069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:55.444126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:55.469199   54581 cri.go:89] found id: ""
	I1201 19:33:55.469212   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.469219   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:55.469224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:55.469284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:55.494106   54581 cri.go:89] found id: ""
	I1201 19:33:55.494123   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.494130   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:55.494135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:55.494192   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:55.523658   54581 cri.go:89] found id: ""
	I1201 19:33:55.523671   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.523678   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:55.523683   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:55.523742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:55.549084   54581 cri.go:89] found id: ""
	I1201 19:33:55.549097   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.549105   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:55.549110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:55.549171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:55.573973   54581 cri.go:89] found id: ""
	I1201 19:33:55.573986   54581 logs.go:282] 0 containers: []
	W1201 19:33:55.573993   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:55.574001   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:55.574014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:33:55.629601   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:55.629618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:55.640511   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:55.640527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:55.703852   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:55.695898   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.696539   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698266   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.698766   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:55.700262   12082 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:55.703862   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:55.703875   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:55.767135   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:55.767154   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.297608   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:33:58.307660   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:33:58.307729   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:33:58.331936   54581 cri.go:89] found id: ""
	I1201 19:33:58.331948   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.331955   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:33:58.331961   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:33:58.332023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:33:58.356515   54581 cri.go:89] found id: ""
	I1201 19:33:58.356528   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.356535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:33:58.356544   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:33:58.356601   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:33:58.381178   54581 cri.go:89] found id: ""
	I1201 19:33:58.381191   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.381198   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:33:58.381203   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:33:58.381259   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:33:58.405890   54581 cri.go:89] found id: ""
	I1201 19:33:58.405904   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.405911   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:33:58.405916   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:33:58.405971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:33:58.429783   54581 cri.go:89] found id: ""
	I1201 19:33:58.429796   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.429804   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:33:58.429809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:33:58.429875   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:33:58.454357   54581 cri.go:89] found id: ""
	I1201 19:33:58.454370   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.454377   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:33:58.454383   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:33:58.454443   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:33:58.483382   54581 cri.go:89] found id: ""
	I1201 19:33:58.483395   54581 logs.go:282] 0 containers: []
	W1201 19:33:58.483403   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:33:58.483410   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:33:58.483421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:33:58.494465   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:33:58.494480   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:33:58.557097   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:33:58.549236   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.549892   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.551486   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.552099   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:33:58.553766   12183 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:33:58.557108   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:33:58.557119   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:33:58.624200   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:33:58.624219   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:33:58.654678   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:33:58.654694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.213704   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:01.225298   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:01.225360   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:01.251173   54581 cri.go:89] found id: ""
	I1201 19:34:01.251187   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.251194   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:01.251200   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:01.251272   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:01.278884   54581 cri.go:89] found id: ""
	I1201 19:34:01.278897   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.278904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:01.278910   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:01.278967   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:01.305393   54581 cri.go:89] found id: ""
	I1201 19:34:01.305407   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.305414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:01.305419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:01.305522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:01.331958   54581 cri.go:89] found id: ""
	I1201 19:34:01.331971   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.331978   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:01.331983   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:01.332042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:01.357701   54581 cri.go:89] found id: ""
	I1201 19:34:01.357714   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.357721   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:01.357727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:01.357786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:01.384631   54581 cri.go:89] found id: ""
	I1201 19:34:01.384645   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.384662   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:01.384668   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:01.384742   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:01.410554   54581 cri.go:89] found id: ""
	I1201 19:34:01.410567   54581 logs.go:282] 0 containers: []
	W1201 19:34:01.410574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:01.410582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:01.410591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:01.466596   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:01.466614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:01.477827   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:01.477843   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:01.543509   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:01.534664   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.535285   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.536986   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.537816   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:01.539578   12289 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:01.543518   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:01.543529   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:01.606587   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:01.606608   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:04.136300   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:04.146336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:04.146412   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:04.177880   54581 cri.go:89] found id: ""
	I1201 19:34:04.177894   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.177901   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:04.177906   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:04.177971   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:04.203986   54581 cri.go:89] found id: ""
	I1201 19:34:04.203999   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.204006   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:04.204012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:04.204068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:04.228899   54581 cri.go:89] found id: ""
	I1201 19:34:04.228912   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.228920   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:04.228925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:04.228989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:04.254700   54581 cri.go:89] found id: ""
	I1201 19:34:04.254715   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.254722   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:04.254729   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:04.254788   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:04.280370   54581 cri.go:89] found id: ""
	I1201 19:34:04.280383   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.280390   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:04.280396   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:04.280453   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:04.304821   54581 cri.go:89] found id: ""
	I1201 19:34:04.304834   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.304842   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:04.304847   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:04.304910   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:04.331513   54581 cri.go:89] found id: ""
	I1201 19:34:04.331525   54581 logs.go:282] 0 containers: []
	W1201 19:34:04.331533   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:04.331540   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:04.331550   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:04.390353   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:04.390371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:04.403182   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:04.403198   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:04.471239   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:04.463543   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.464228   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.465995   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.466529   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:04.467895   12395 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:04.471261   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:04.471273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:04.534546   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:04.534567   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:07.063925   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:07.074362   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:07.074427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:07.107919   54581 cri.go:89] found id: ""
	I1201 19:34:07.107933   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.107940   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:07.107946   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:07.108003   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:07.137952   54581 cri.go:89] found id: ""
	I1201 19:34:07.137965   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.137973   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:07.137978   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:07.138038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:07.172024   54581 cri.go:89] found id: ""
	I1201 19:34:07.172037   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.172044   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:07.172049   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:07.172107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:07.196732   54581 cri.go:89] found id: ""
	I1201 19:34:07.196745   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.196752   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:07.196759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:07.196814   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:07.221862   54581 cri.go:89] found id: ""
	I1201 19:34:07.221875   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.221882   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:07.221888   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:07.221947   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:07.249751   54581 cri.go:89] found id: ""
	I1201 19:34:07.249765   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.249771   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:07.249777   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:07.249833   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:07.275027   54581 cri.go:89] found id: ""
	I1201 19:34:07.275040   54581 logs.go:282] 0 containers: []
	W1201 19:34:07.275047   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:07.275055   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:07.275065   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:07.330139   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:07.330156   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:07.341431   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:07.341447   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:07.404752   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:07.397508   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.398090   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399307   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.399866   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:07.401364   12498 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:07.404762   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:07.404780   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:07.471227   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:07.471244   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.003255   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:10.013892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:10.013949   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:10.059011   54581 cri.go:89] found id: ""
	I1201 19:34:10.059025   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.059033   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:10.059039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:10.059101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:10.096138   54581 cri.go:89] found id: ""
	I1201 19:34:10.096152   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.096170   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:10.096177   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:10.096282   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:10.138539   54581 cri.go:89] found id: ""
	I1201 19:34:10.138600   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.138612   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:10.138618   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:10.138688   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:10.168476   54581 cri.go:89] found id: ""
	I1201 19:34:10.168490   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.168497   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:10.168502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:10.168580   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:10.194454   54581 cri.go:89] found id: ""
	I1201 19:34:10.194480   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.194487   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:10.194493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:10.194560   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:10.219419   54581 cri.go:89] found id: ""
	I1201 19:34:10.219432   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.219439   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:10.219445   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:10.219507   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:10.244925   54581 cri.go:89] found id: ""
	I1201 19:34:10.244938   54581 logs.go:282] 0 containers: []
	W1201 19:34:10.244945   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:10.244953   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:10.244964   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:10.311653   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:10.302119   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.303145   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.304209   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.305980   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:10.306585   12596 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:10.311663   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:10.311673   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:10.377857   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:10.377877   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:10.407833   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:10.407851   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:10.467737   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:10.467757   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:12.980376   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:12.990779   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:12.990838   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:13.016106   54581 cri.go:89] found id: ""
	I1201 19:34:13.016120   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.016127   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:13.016133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:13.016198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:13.044361   54581 cri.go:89] found id: ""
	I1201 19:34:13.044375   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.044382   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:13.044387   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:13.044444   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:13.069827   54581 cri.go:89] found id: ""
	I1201 19:34:13.069841   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.069849   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:13.069854   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:13.069913   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:13.110851   54581 cri.go:89] found id: ""
	I1201 19:34:13.110864   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.110871   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:13.110876   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:13.110933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:13.141612   54581 cri.go:89] found id: ""
	I1201 19:34:13.141626   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.141633   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:13.141638   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:13.141695   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:13.168579   54581 cri.go:89] found id: ""
	I1201 19:34:13.168592   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.168599   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:13.168604   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:13.168676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:13.194182   54581 cri.go:89] found id: ""
	I1201 19:34:13.194196   54581 logs.go:282] 0 containers: []
	W1201 19:34:13.194204   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:13.194211   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:13.194221   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:13.255821   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:13.255840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:13.267071   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:13.267087   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:13.336403   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:13.328186   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.328752   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.330453   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.331065   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:13.332682   12711 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:13.336424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:13.336434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:13.399839   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:13.399859   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:15.930208   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:15.940605   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:15.940671   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:15.966201   54581 cri.go:89] found id: ""
	I1201 19:34:15.966215   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.966223   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:15.966228   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:15.966291   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:15.996515   54581 cri.go:89] found id: ""
	I1201 19:34:15.996528   54581 logs.go:282] 0 containers: []
	W1201 19:34:15.996535   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:15.996541   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:15.996598   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:16.022535   54581 cri.go:89] found id: ""
	I1201 19:34:16.022550   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.022564   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:16.022569   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:16.022630   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:16.057222   54581 cri.go:89] found id: ""
	I1201 19:34:16.057236   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.057246   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:16.057252   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:16.057313   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:16.087879   54581 cri.go:89] found id: ""
	I1201 19:34:16.087893   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.087900   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:16.087905   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:16.087965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:16.120946   54581 cri.go:89] found id: ""
	I1201 19:34:16.120960   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.120968   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:16.120974   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:16.121035   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:16.154523   54581 cri.go:89] found id: ""
	I1201 19:34:16.154538   54581 logs.go:282] 0 containers: []
	W1201 19:34:16.154544   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:16.154552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:16.154562   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:16.227282   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:16.219541   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.220392   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.221962   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.222407   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:16.223912   12812 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:16.227292   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:16.227303   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:16.291304   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:16.291323   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:16.320283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:16.320299   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:16.379997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:16.380014   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:18.891691   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:18.901502   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:18.901561   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:18.926115   54581 cri.go:89] found id: ""
	I1201 19:34:18.926128   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.926135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:18.926141   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:18.926212   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:18.951977   54581 cri.go:89] found id: ""
	I1201 19:34:18.951991   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.951998   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:18.952003   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:18.952068   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:18.983248   54581 cri.go:89] found id: ""
	I1201 19:34:18.983266   54581 logs.go:282] 0 containers: []
	W1201 19:34:18.983273   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:18.983278   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:18.983342   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:19.010990   54581 cri.go:89] found id: ""
	I1201 19:34:19.011010   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.011018   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:19.011024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:19.011086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:19.036672   54581 cri.go:89] found id: ""
	I1201 19:34:19.036686   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.036693   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:19.036699   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:19.036767   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:19.061847   54581 cri.go:89] found id: ""
	I1201 19:34:19.061861   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.061868   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:19.061873   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:19.061933   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:19.095496   54581 cri.go:89] found id: ""
	I1201 19:34:19.095518   54581 logs.go:282] 0 containers: []
	W1201 19:34:19.095525   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:19.095534   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:19.095544   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:19.160188   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:19.160209   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:19.171389   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:19.171411   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:19.237242   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:19.229376   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.230053   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.231639   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.232292   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:19.233958   12922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:19.237253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:19.237273   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:19.299987   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:19.300005   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:21.834525   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:21.845009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:21.845070   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:21.869831   54581 cri.go:89] found id: ""
	I1201 19:34:21.869848   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.869855   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:21.869863   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:21.869920   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:21.894806   54581 cri.go:89] found id: ""
	I1201 19:34:21.894819   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.894826   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:21.894831   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:21.894888   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:21.919467   54581 cri.go:89] found id: ""
	I1201 19:34:21.919481   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.919489   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:21.919494   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:21.919557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:21.947371   54581 cri.go:89] found id: ""
	I1201 19:34:21.947384   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.947392   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:21.947397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:21.947466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:21.972455   54581 cri.go:89] found id: ""
	I1201 19:34:21.972469   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.972488   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:21.972493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:21.972551   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:21.998955   54581 cri.go:89] found id: ""
	I1201 19:34:21.998969   54581 logs.go:282] 0 containers: []
	W1201 19:34:21.998977   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:21.998982   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:21.999044   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:22.030320   54581 cri.go:89] found id: ""
	I1201 19:34:22.030348   54581 logs.go:282] 0 containers: []
	W1201 19:34:22.030356   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:22.030365   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:22.030378   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:22.091531   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:22.091549   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:22.107258   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:22.107285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:22.185420   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:22.177086   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.177621   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179399   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.179753   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:22.181289   13024 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:22.185431   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:22.185442   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:22.250849   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:22.250866   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:24.779249   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:24.792463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:24.792522   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:24.817350   54581 cri.go:89] found id: ""
	I1201 19:34:24.817364   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.817371   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:24.817377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:24.817434   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:24.842191   54581 cri.go:89] found id: ""
	I1201 19:34:24.842205   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.842218   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:24.842224   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:24.842284   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:24.867478   54581 cri.go:89] found id: ""
	I1201 19:34:24.867492   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.867499   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:24.867505   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:24.867576   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:24.899422   54581 cri.go:89] found id: ""
	I1201 19:34:24.899436   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.899443   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:24.899452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:24.899509   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:24.934866   54581 cri.go:89] found id: ""
	I1201 19:34:24.934880   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.934887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:24.934893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:24.934956   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:24.959270   54581 cri.go:89] found id: ""
	I1201 19:34:24.959284   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.959291   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:24.959297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:24.959362   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:24.984211   54581 cri.go:89] found id: ""
	I1201 19:34:24.984224   54581 logs.go:282] 0 containers: []
	W1201 19:34:24.984231   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:24.984239   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:24.984259   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:25.012471   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:25.012487   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:25.072643   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:25.072660   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:25.083552   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:25.083571   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:25.160495   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:25.152596   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.153060   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.154476   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.155308   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:25.157038   13141 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:25.160504   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:25.160516   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:27.727176   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:27.737246   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:27.737307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:27.761343   54581 cri.go:89] found id: ""
	I1201 19:34:27.761357   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.761364   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:27.761370   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:27.761428   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:27.786257   54581 cri.go:89] found id: ""
	I1201 19:34:27.786276   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.786283   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:27.786288   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:27.786344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:27.810779   54581 cri.go:89] found id: ""
	I1201 19:34:27.810798   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.810807   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:27.810812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:27.810874   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:27.834773   54581 cri.go:89] found id: ""
	I1201 19:34:27.834792   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.834799   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:27.834804   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:27.834860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:27.862223   54581 cri.go:89] found id: ""
	I1201 19:34:27.862241   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.862248   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:27.862253   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:27.862307   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:27.887279   54581 cri.go:89] found id: ""
	I1201 19:34:27.887292   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.887299   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:27.887305   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:27.887361   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:27.910821   54581 cri.go:89] found id: ""
	I1201 19:34:27.910834   54581 logs.go:282] 0 containers: []
	W1201 19:34:27.910842   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:27.910849   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:27.910872   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:27.920894   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:27.920909   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:27.982787   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:27.975101   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.975853   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977512   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.977821   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:27.979278   13228 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:27.982797   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:27.982808   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:28.049448   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:28.049466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:28.083298   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:28.083315   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:30.648755   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:30.659054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:30.659115   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:30.683776   54581 cri.go:89] found id: ""
	I1201 19:34:30.683790   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.683797   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:30.683802   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:30.683858   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:30.708715   54581 cri.go:89] found id: ""
	I1201 19:34:30.708729   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.708736   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:30.708741   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:30.708801   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:30.732741   54581 cri.go:89] found id: ""
	I1201 19:34:30.732754   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.732761   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:30.732767   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:30.732821   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:30.762264   54581 cri.go:89] found id: ""
	I1201 19:34:30.762278   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.762284   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:30.762290   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:30.762353   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:30.789298   54581 cri.go:89] found id: ""
	I1201 19:34:30.789312   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.789319   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:30.789324   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:30.789381   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:30.814068   54581 cri.go:89] found id: ""
	I1201 19:34:30.814081   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.814089   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:30.814095   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:30.814157   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:30.841381   54581 cri.go:89] found id: ""
	I1201 19:34:30.841394   54581 logs.go:282] 0 containers: []
	W1201 19:34:30.841402   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:30.841409   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:30.841431   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:30.902920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:30.895184   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.895957   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.897587   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.898066   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:30.899742   13328 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:30.902931   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:30.902943   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:30.965009   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:30.965026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:30.993347   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:30.993370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:31.049258   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:31.049275   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.560996   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:33.571497   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:33.571557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:33.596875   54581 cri.go:89] found id: ""
	I1201 19:34:33.596889   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.596896   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:33.596901   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:33.596960   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:33.623640   54581 cri.go:89] found id: ""
	I1201 19:34:33.623653   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.623659   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:33.623664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:33.623725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:33.647792   54581 cri.go:89] found id: ""
	I1201 19:34:33.647806   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.647814   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:33.647819   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:33.647882   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:33.672114   54581 cri.go:89] found id: ""
	I1201 19:34:33.672127   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.672134   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:33.672139   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:33.672197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:33.704799   54581 cri.go:89] found id: ""
	I1201 19:34:33.704812   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.704820   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:33.704825   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:33.704885   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:33.728981   54581 cri.go:89] found id: ""
	I1201 19:34:33.728995   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.729001   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:33.729006   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:33.729063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:33.756005   54581 cri.go:89] found id: ""
	I1201 19:34:33.756019   54581 logs.go:282] 0 containers: []
	W1201 19:34:33.756027   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:33.756035   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:33.756046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:33.788420   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:33.788437   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:33.848036   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:33.848054   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:33.858909   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:33.858925   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:33.921156   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:33.913444   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.914124   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.915782   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.916126   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:33.917725   13448 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:33.921167   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:33.921178   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.484434   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:36.494616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:36.494679   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:36.520017   54581 cri.go:89] found id: ""
	I1201 19:34:36.520031   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.520038   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:36.520044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:36.520100   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:36.545876   54581 cri.go:89] found id: ""
	I1201 19:34:36.545890   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.545897   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:36.545903   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:36.545966   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:36.571571   54581 cri.go:89] found id: ""
	I1201 19:34:36.571584   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.571591   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:36.571596   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:36.571653   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:36.596997   54581 cri.go:89] found id: ""
	I1201 19:34:36.597012   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.597019   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:36.597024   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:36.597101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:36.623469   54581 cri.go:89] found id: ""
	I1201 19:34:36.623483   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.623491   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:36.623496   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:36.623556   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:36.651811   54581 cri.go:89] found id: ""
	I1201 19:34:36.651824   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.651831   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:36.651837   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:36.651893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:36.676659   54581 cri.go:89] found id: ""
	I1201 19:34:36.676673   54581 logs.go:282] 0 containers: []
	W1201 19:34:36.676680   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:36.676688   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:36.676697   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:36.732392   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:36.732410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:36.743384   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:36.743400   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:36.805329   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:36.797922   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.798318   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.799902   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.800240   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:36.801924   13540 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:36.805338   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:36.805349   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:36.867566   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:36.867584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:39.402157   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:39.412161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:39.412220   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:39.439366   54581 cri.go:89] found id: ""
	I1201 19:34:39.439380   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.439387   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:39.439392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:39.439451   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:39.464076   54581 cri.go:89] found id: ""
	I1201 19:34:39.464090   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.464097   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:39.464108   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:39.464171   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:39.488248   54581 cri.go:89] found id: ""
	I1201 19:34:39.488262   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.488270   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:39.488275   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:39.488331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:39.517302   54581 cri.go:89] found id: ""
	I1201 19:34:39.517315   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.517322   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:39.517328   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:39.517385   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:39.542966   54581 cri.go:89] found id: ""
	I1201 19:34:39.542980   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.542986   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:39.542992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:39.543051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:39.568903   54581 cri.go:89] found id: ""
	I1201 19:34:39.568917   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.568924   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:39.568929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:39.568990   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:39.594057   54581 cri.go:89] found id: ""
	I1201 19:34:39.594069   54581 logs.go:282] 0 containers: []
	W1201 19:34:39.594076   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:39.594084   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:39.594093   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:39.649679   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:39.649698   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:39.660114   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:39.660133   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:39.725472   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:39.717221   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.717686   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.719555   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.720121   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:39.721813   13646 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:39.725500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:39.725512   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:39.793738   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:39.793756   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:42.322742   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:42.333451   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:42.333536   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:42.368118   54581 cri.go:89] found id: ""
	I1201 19:34:42.368132   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.368139   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:42.368146   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:42.368217   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:42.402172   54581 cri.go:89] found id: ""
	I1201 19:34:42.402186   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.402193   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:42.402198   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:42.402266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:42.426759   54581 cri.go:89] found id: ""
	I1201 19:34:42.426772   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.426780   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:42.426785   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:42.426842   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:42.466084   54581 cri.go:89] found id: ""
	I1201 19:34:42.466097   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.466105   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:42.466110   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:42.466168   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:42.490814   54581 cri.go:89] found id: ""
	I1201 19:34:42.490828   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.490835   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:42.490841   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:42.490899   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:42.516557   54581 cri.go:89] found id: ""
	I1201 19:34:42.516570   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.516578   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:42.516583   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:42.516651   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:42.542203   54581 cri.go:89] found id: ""
	I1201 19:34:42.542218   54581 logs.go:282] 0 containers: []
	W1201 19:34:42.542224   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:42.542233   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:42.542243   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:42.599254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:42.599272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:42.610313   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:42.610328   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:42.677502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:42.669453   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.670143   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.671832   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.672381   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:42.674062   13750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:42.677514   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:42.677527   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:42.751656   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:42.751683   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.281764   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:45.295929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:45.296004   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:45.355982   54581 cri.go:89] found id: ""
	I1201 19:34:45.356019   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.356027   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:45.356043   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:45.356214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:45.395974   54581 cri.go:89] found id: ""
	I1201 19:34:45.395987   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.396003   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:45.396008   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:45.396064   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:45.425011   54581 cri.go:89] found id: ""
	I1201 19:34:45.425027   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.425035   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:45.425041   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:45.425175   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:45.450304   54581 cri.go:89] found id: ""
	I1201 19:34:45.450317   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.450325   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:45.450330   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:45.450399   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:45.480282   54581 cri.go:89] found id: ""
	I1201 19:34:45.480296   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.480302   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:45.480307   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:45.480376   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:45.511012   54581 cri.go:89] found id: ""
	I1201 19:34:45.511026   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.511033   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:45.511039   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:45.511101   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:45.536767   54581 cri.go:89] found id: ""
	I1201 19:34:45.536781   54581 logs.go:282] 0 containers: []
	W1201 19:34:45.536797   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:45.536806   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:45.536818   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:45.547801   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:45.547822   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:45.615408   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:45.607606   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.608165   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.609808   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.610340   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:45.611850   13850 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:45.615424   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:45.615434   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:45.679022   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:45.679041   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:45.711030   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:45.711049   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.268349   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:48.279339   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:48.279398   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:48.304817   54581 cri.go:89] found id: ""
	I1201 19:34:48.304831   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.304839   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:48.304844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:48.304905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:48.329897   54581 cri.go:89] found id: ""
	I1201 19:34:48.329911   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.329919   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:48.329924   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:48.329982   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:48.369087   54581 cri.go:89] found id: ""
	I1201 19:34:48.369100   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.369107   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:48.369112   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:48.369169   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:48.400882   54581 cri.go:89] found id: ""
	I1201 19:34:48.400896   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.400903   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:48.400909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:48.400965   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:48.426896   54581 cri.go:89] found id: ""
	I1201 19:34:48.426912   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.426920   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:48.426925   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:48.426987   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:48.455956   54581 cri.go:89] found id: ""
	I1201 19:34:48.455969   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.455987   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:48.455994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:48.456051   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:48.480640   54581 cri.go:89] found id: ""
	I1201 19:34:48.480653   54581 logs.go:282] 0 containers: []
	W1201 19:34:48.480671   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:48.480679   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:48.480690   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:48.536591   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:48.536609   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:48.547466   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:48.547482   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:48.620325   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:48.612629   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.613458   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615094   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.615421   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:48.616981   13957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:48.620335   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:48.620345   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:48.683388   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:48.683407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:51.214144   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:51.224292   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:51.224364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:51.247923   54581 cri.go:89] found id: ""
	I1201 19:34:51.247937   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.247945   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:51.247952   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:51.248011   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:51.273984   54581 cri.go:89] found id: ""
	I1201 19:34:51.273998   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.274005   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:51.274011   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:51.274072   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:51.298775   54581 cri.go:89] found id: ""
	I1201 19:34:51.298789   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.298796   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:51.298801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:51.298860   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:51.326553   54581 cri.go:89] found id: ""
	I1201 19:34:51.326567   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.326574   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:51.326580   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:51.326639   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:51.360945   54581 cri.go:89] found id: ""
	I1201 19:34:51.360959   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.360987   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:51.360992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:51.361059   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:51.396255   54581 cri.go:89] found id: ""
	I1201 19:34:51.396282   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.396290   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:51.396296   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:51.396369   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:51.427687   54581 cri.go:89] found id: ""
	I1201 19:34:51.427700   54581 logs.go:282] 0 containers: []
	W1201 19:34:51.427707   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:51.427715   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:51.427734   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:51.483915   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:51.483934   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:51.495247   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:51.495271   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:51.559547   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:51.551369   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.552102   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.553914   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.554558   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:51.556146   14062 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:51.559558   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:51.559568   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:51.623141   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:51.623161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:54.157001   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:54.170439   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:54.170498   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:54.203772   54581 cri.go:89] found id: ""
	I1201 19:34:54.203785   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.203792   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:54.203798   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:54.203854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:54.231733   54581 cri.go:89] found id: ""
	I1201 19:34:54.231747   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.231754   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:54.231759   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:54.231817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:54.256716   54581 cri.go:89] found id: ""
	I1201 19:34:54.256739   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.256746   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:54.256752   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:54.256817   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:54.281376   54581 cri.go:89] found id: ""
	I1201 19:34:54.281390   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.281407   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:54.281413   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:54.281469   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:54.305969   54581 cri.go:89] found id: ""
	I1201 19:34:54.305982   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.305989   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:54.305994   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:54.306049   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:54.330385   54581 cri.go:89] found id: ""
	I1201 19:34:54.330399   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.330406   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:54.330422   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:54.330478   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:54.358455   54581 cri.go:89] found id: ""
	I1201 19:34:54.358478   54581 logs.go:282] 0 containers: []
	W1201 19:34:54.358489   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:54.358497   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:54.358508   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:54.422783   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:54.422804   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:54.434139   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:54.434153   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:54.499665   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:54.491735   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.492640   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494207   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.494711   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:54.496269   14167 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:54.499677   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:54.499689   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:54.562594   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:54.562614   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:34:57.093944   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:34:57.104140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:34:57.104207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:34:57.129578   54581 cri.go:89] found id: ""
	I1201 19:34:57.129590   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.129597   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:34:57.129603   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:34:57.129663   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:34:57.153119   54581 cri.go:89] found id: ""
	I1201 19:34:57.153133   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.153140   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:34:57.153145   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:34:57.153202   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:34:57.178134   54581 cri.go:89] found id: ""
	I1201 19:34:57.178148   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.178155   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:34:57.178161   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:34:57.178222   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:34:57.208559   54581 cri.go:89] found id: ""
	I1201 19:34:57.208572   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.208579   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:34:57.208585   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:34:57.208642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:34:57.232807   54581 cri.go:89] found id: ""
	I1201 19:34:57.232821   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.232838   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:34:57.232844   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:34:57.232898   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:34:57.257939   54581 cri.go:89] found id: ""
	I1201 19:34:57.257952   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.257959   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:34:57.257964   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:34:57.258022   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:34:57.283855   54581 cri.go:89] found id: ""
	I1201 19:34:57.283869   54581 logs.go:282] 0 containers: []
	W1201 19:34:57.283875   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:34:57.283883   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:34:57.283893   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:34:57.340764   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:34:57.340781   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:34:57.352935   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:34:57.352949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:34:57.427562   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:34:57.420624   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.421152   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422231   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.422530   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:34:57.424104   14271 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:34:57.427571   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:34:57.427581   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:34:57.490526   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:34:57.490553   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.020694   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:00.036199   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:00.036266   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:00.146207   54581 cri.go:89] found id: ""
	I1201 19:35:00.146226   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.146234   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:00.146241   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:00.146319   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:00.271439   54581 cri.go:89] found id: ""
	I1201 19:35:00.271454   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.271462   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:00.271468   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:00.271541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:00.365096   54581 cri.go:89] found id: ""
	I1201 19:35:00.365111   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.365119   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:00.365124   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:00.365190   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:00.419095   54581 cri.go:89] found id: ""
	I1201 19:35:00.419109   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.419116   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:00.419123   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:00.419184   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:00.457455   54581 cri.go:89] found id: ""
	I1201 19:35:00.457470   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.457478   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:00.457507   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:00.457577   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:00.503679   54581 cri.go:89] found id: ""
	I1201 19:35:00.503694   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.503701   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:00.503710   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:00.503803   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:00.546120   54581 cri.go:89] found id: ""
	I1201 19:35:00.546135   54581 logs.go:282] 0 containers: []
	W1201 19:35:00.546142   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:00.546151   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:00.546164   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:00.559836   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:00.559853   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:00.634650   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:00.624097   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.625795   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.626835   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.628840   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:00.629152   14374 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:00.634660   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:00.634675   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:00.700259   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:00.700278   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:00.733345   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:00.733363   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.295407   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:03.305664   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:03.305725   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:03.330370   54581 cri.go:89] found id: ""
	I1201 19:35:03.330385   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.330392   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:03.330397   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:03.330452   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:03.356109   54581 cri.go:89] found id: ""
	I1201 19:35:03.356123   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.356130   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:03.356135   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:03.356198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:03.382338   54581 cri.go:89] found id: ""
	I1201 19:35:03.382352   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.382360   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:03.382366   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:03.382423   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:03.414550   54581 cri.go:89] found id: ""
	I1201 19:35:03.414564   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.414571   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:03.414577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:03.414633   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:03.438540   54581 cri.go:89] found id: ""
	I1201 19:35:03.438553   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.438560   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:03.438565   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:03.438623   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:03.463113   54581 cri.go:89] found id: ""
	I1201 19:35:03.463127   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.463134   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:03.463140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:03.463204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:03.487632   54581 cri.go:89] found id: ""
	I1201 19:35:03.487645   54581 logs.go:282] 0 containers: []
	W1201 19:35:03.487653   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:03.487660   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:03.487670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:03.544515   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:03.544536   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:03.555787   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:03.555803   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:03.627256   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:03.618493   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.619438   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.620165   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.621861   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:03.622291   14481 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:03.627266   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:03.627276   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:03.691235   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:03.691254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.220125   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:06.230749   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:06.230813   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:06.255951   54581 cri.go:89] found id: ""
	I1201 19:35:06.255965   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.255972   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:06.255977   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:06.256034   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:06.281528   54581 cri.go:89] found id: ""
	I1201 19:35:06.281542   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.281549   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:06.281554   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:06.281613   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:06.306502   54581 cri.go:89] found id: ""
	I1201 19:35:06.306515   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.306522   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:06.306527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:06.306590   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:06.337726   54581 cri.go:89] found id: ""
	I1201 19:35:06.337739   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.337745   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:06.337751   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:06.337810   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:06.367682   54581 cri.go:89] found id: ""
	I1201 19:35:06.367696   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.367713   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:06.367726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:06.367793   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:06.397675   54581 cri.go:89] found id: ""
	I1201 19:35:06.397690   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.397707   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:06.397713   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:06.397778   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:06.424426   54581 cri.go:89] found id: ""
	I1201 19:35:06.424439   54581 logs.go:282] 0 containers: []
	W1201 19:35:06.424452   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:06.424460   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:06.424471   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:06.435325   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:06.435340   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:06.499920   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:06.492188   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.492789   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494445   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494930   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.496500   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:06.492188   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.492789   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494445   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.494930   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:06.496500   14581 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:06.499942   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:06.499952   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:06.564348   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:06.564367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:06.592906   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:06.592921   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:09.151061   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:09.161179   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:09.161240   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:09.186739   54581 cri.go:89] found id: ""
	I1201 19:35:09.186752   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.186759   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:09.186765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:09.186822   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:09.211245   54581 cri.go:89] found id: ""
	I1201 19:35:09.211259   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.211267   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:09.211273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:09.211336   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:09.239043   54581 cri.go:89] found id: ""
	I1201 19:35:09.239056   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.239063   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:09.239068   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:09.239125   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:09.264055   54581 cri.go:89] found id: ""
	I1201 19:35:09.264068   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.264076   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:09.264081   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:09.264137   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:09.288509   54581 cri.go:89] found id: ""
	I1201 19:35:09.288522   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.288529   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:09.288536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:09.288593   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:09.312763   54581 cri.go:89] found id: ""
	I1201 19:35:09.312777   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.312784   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:09.312789   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:09.312851   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:09.344164   54581 cri.go:89] found id: ""
	I1201 19:35:09.344177   54581 logs.go:282] 0 containers: []
	W1201 19:35:09.344184   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:09.344192   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:09.344203   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:09.356120   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:09.356134   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:09.428320   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:09.420284   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.420923   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.422622   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.423259   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.424866   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:09.420284   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.420923   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.422622   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.423259   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:09.424866   14685 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:09.428329   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:09.428339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:09.491282   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:09.491301   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:09.518473   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:09.518488   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.081815   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:12.092336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:12.092400   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:12.117269   54581 cri.go:89] found id: ""
	I1201 19:35:12.117284   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.117291   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:12.117297   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:12.117355   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:12.141885   54581 cri.go:89] found id: ""
	I1201 19:35:12.141898   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.141904   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:12.141909   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:12.141968   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:12.166386   54581 cri.go:89] found id: ""
	I1201 19:35:12.166400   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.166407   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:12.166411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:12.166479   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:12.190615   54581 cri.go:89] found id: ""
	I1201 19:35:12.190628   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.190636   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:12.190641   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:12.190701   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:12.219887   54581 cri.go:89] found id: ""
	I1201 19:35:12.219900   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.219907   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:12.219912   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:12.219970   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:12.244718   54581 cri.go:89] found id: ""
	I1201 19:35:12.244731   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.244738   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:12.244743   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:12.244802   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:12.272273   54581 cri.go:89] found id: ""
	I1201 19:35:12.272287   54581 logs.go:282] 0 containers: []
	W1201 19:35:12.272294   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:12.272301   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:12.272312   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:12.329315   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:12.329334   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:12.343015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:12.343032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:12.419939   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:12.411319   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.412222   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414132   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414468   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.415971   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:12.411319   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.412222   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414132   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.414468   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:12.415971   14794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:12.419949   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:12.419960   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:12.482187   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:12.482205   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:15.011802   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:15.022432   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:15.022499   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:15.094886   54581 cri.go:89] found id: ""
	I1201 19:35:15.094901   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.094909   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:15.094915   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:15.094978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:15.120839   54581 cri.go:89] found id: ""
	I1201 19:35:15.120853   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.120860   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:15.120865   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:15.120927   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:15.150767   54581 cri.go:89] found id: ""
	I1201 19:35:15.150781   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.150795   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:15.150801   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:15.150867   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:15.177630   54581 cri.go:89] found id: ""
	I1201 19:35:15.177644   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.177651   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:15.177656   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:15.177727   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:15.203467   54581 cri.go:89] found id: ""
	I1201 19:35:15.203480   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.203498   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:15.203504   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:15.203563   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:15.229010   54581 cri.go:89] found id: ""
	I1201 19:35:15.229023   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.229031   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:15.229036   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:15.229128   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:15.254029   54581 cri.go:89] found id: ""
	I1201 19:35:15.254043   54581 logs.go:282] 0 containers: []
	W1201 19:35:15.254051   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:15.254058   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:15.254068   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:15.309931   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:15.309949   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:15.320452   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:15.320466   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:15.413158   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:15.405233   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.405928   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.407533   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.408047   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.409794   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:15.405233   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.405928   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.407533   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.408047   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:15.409794   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:15.413169   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:15.413180   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:15.475409   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:15.475428   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:18.004450   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:18.015126   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:18.015185   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:18.046344   54581 cri.go:89] found id: ""
	I1201 19:35:18.046359   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.046366   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:18.046373   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:18.046436   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:18.074519   54581 cri.go:89] found id: ""
	I1201 19:35:18.074532   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.074539   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:18.074545   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:18.074603   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:18.103787   54581 cri.go:89] found id: ""
	I1201 19:35:18.103801   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.103808   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:18.103814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:18.103869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:18.130363   54581 cri.go:89] found id: ""
	I1201 19:35:18.130377   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.130384   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:18.130390   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:18.130449   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:18.155589   54581 cri.go:89] found id: ""
	I1201 19:35:18.155616   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.155625   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:18.155630   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:18.155699   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:18.180628   54581 cri.go:89] found id: ""
	I1201 19:35:18.180641   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.180648   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:18.180654   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:18.180711   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:18.205996   54581 cri.go:89] found id: ""
	I1201 19:35:18.206026   54581 logs.go:282] 0 containers: []
	W1201 19:35:18.206033   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:18.206041   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:18.206051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:18.260718   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:18.260736   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:18.271842   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:18.271858   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:18.342769   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:18.332100   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.332989   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.334523   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.336007   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.337221   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:18.332100   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.332989   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.334523   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.336007   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:18.337221   14995 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:18.342780   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:18.342793   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:18.423726   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:18.423744   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:20.954199   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:20.964087   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:20.964143   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:20.987490   54581 cri.go:89] found id: ""
	I1201 19:35:20.987504   54581 logs.go:282] 0 containers: []
	W1201 19:35:20.987510   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:20.987516   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:20.987572   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:21.012114   54581 cri.go:89] found id: ""
	I1201 19:35:21.012128   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.012135   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:21.012140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:21.012201   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:21.037730   54581 cri.go:89] found id: ""
	I1201 19:35:21.037744   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.037751   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:21.037756   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:21.037815   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:21.062445   54581 cri.go:89] found id: ""
	I1201 19:35:21.062458   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.062465   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:21.062471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:21.062529   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:21.086847   54581 cri.go:89] found id: ""
	I1201 19:35:21.086860   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.086867   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:21.086872   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:21.086930   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:21.111866   54581 cri.go:89] found id: ""
	I1201 19:35:21.111880   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.111886   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:21.111892   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:21.111948   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:21.136296   54581 cri.go:89] found id: ""
	I1201 19:35:21.136311   54581 logs.go:282] 0 containers: []
	W1201 19:35:21.136318   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:21.136326   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:21.136343   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:21.200999   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:21.193700   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.194197   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.195727   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.196080   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:21.197671   15094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:21.201009   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:21.201020   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:21.265838   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:21.265857   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:21.296214   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:21.296230   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:21.354254   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:21.354272   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:23.868647   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:23.879143   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:23.879205   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:23.907613   54581 cri.go:89] found id: ""
	I1201 19:35:23.907633   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.907640   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:23.907645   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:23.907705   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:23.932767   54581 cri.go:89] found id: ""
	I1201 19:35:23.932781   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.932787   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:23.932793   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:23.932849   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:23.961305   54581 cri.go:89] found id: ""
	I1201 19:35:23.961319   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.961326   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:23.961331   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:23.961387   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:23.986651   54581 cri.go:89] found id: ""
	I1201 19:35:23.986664   54581 logs.go:282] 0 containers: []
	W1201 19:35:23.986670   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:23.986676   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:23.986734   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:24.011204   54581 cri.go:89] found id: ""
	I1201 19:35:24.011218   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.011225   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:24.011230   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:24.011286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:24.040784   54581 cri.go:89] found id: ""
	I1201 19:35:24.040798   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.040806   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:24.040812   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:24.040871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:24.067432   54581 cri.go:89] found id: ""
	I1201 19:35:24.067446   54581 logs.go:282] 0 containers: []
	W1201 19:35:24.067453   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:24.067461   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:24.067472   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:24.132929   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:24.124477   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.125269   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.126064   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.127787   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:24.128526   15196 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:24.132946   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:24.132956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:24.194894   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:24.194912   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:24.225351   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:24.225366   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:24.282142   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:24.282161   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:26.793143   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:26.803454   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:26.803518   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:26.827433   54581 cri.go:89] found id: ""
	I1201 19:35:26.827447   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.827454   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:26.827459   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:26.827514   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:26.851666   54581 cri.go:89] found id: ""
	I1201 19:35:26.851680   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.851686   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:26.851691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:26.851749   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:26.880353   54581 cri.go:89] found id: ""
	I1201 19:35:26.880367   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.880374   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:26.880379   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:26.880437   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:26.908944   54581 cri.go:89] found id: ""
	I1201 19:35:26.908957   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.908964   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:26.908969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:26.909025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:26.933983   54581 cri.go:89] found id: ""
	I1201 19:35:26.933996   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.934003   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:26.934009   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:26.934069   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:26.958791   54581 cri.go:89] found id: ""
	I1201 19:35:26.958805   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.958812   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:26.958818   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:26.958878   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:26.983156   54581 cri.go:89] found id: ""
	I1201 19:35:26.983170   54581 logs.go:282] 0 containers: []
	W1201 19:35:26.983177   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:26.983185   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:26.983200   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:27.038997   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:27.039015   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:27.050299   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:27.050314   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:27.113733   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:27.106446   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.106878   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108358   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.108698   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:27.110157   15308 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:27.113744   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:27.113754   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:27.176267   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:27.176285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.706128   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:29.716285   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:29.716344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:29.741420   54581 cri.go:89] found id: ""
	I1201 19:35:29.741435   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.741442   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:29.741447   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:29.741545   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:29.766524   54581 cri.go:89] found id: ""
	I1201 19:35:29.766538   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.766545   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:29.766550   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:29.766616   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:29.795421   54581 cri.go:89] found id: ""
	I1201 19:35:29.795434   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.795441   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:29.795446   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:29.795511   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:29.821121   54581 cri.go:89] found id: ""
	I1201 19:35:29.821135   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.821142   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:29.821147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:29.821204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:29.849641   54581 cri.go:89] found id: ""
	I1201 19:35:29.849654   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.849662   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:29.849667   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:29.849724   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:29.874049   54581 cri.go:89] found id: ""
	I1201 19:35:29.874063   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.874069   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:29.874075   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:29.874136   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:29.897867   54581 cri.go:89] found id: ""
	I1201 19:35:29.897880   54581 logs.go:282] 0 containers: []
	W1201 19:35:29.897887   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:29.897895   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:29.897905   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:29.959029   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:29.959046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:29.991283   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:29.991298   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:30.051265   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:30.051286   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:30.082322   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:30.082339   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:30.173300   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:30.163817   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.164718   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.165788   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.167572   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:30.168284   15426 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:32.673672   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:32.683965   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:32.684023   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:32.712191   54581 cri.go:89] found id: ""
	I1201 19:35:32.712204   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.712211   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:32.712216   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:32.712275   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:32.739246   54581 cri.go:89] found id: ""
	I1201 19:35:32.739259   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.739266   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:32.739272   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:32.739331   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:32.763898   54581 cri.go:89] found id: ""
	I1201 19:35:32.763911   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.763924   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:32.763929   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:32.763989   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:32.789967   54581 cri.go:89] found id: ""
	I1201 19:35:32.789990   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.789997   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:32.790004   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:32.790063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:32.816013   54581 cri.go:89] found id: ""
	I1201 19:35:32.816028   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.816035   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:32.816040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:32.816098   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:32.839560   54581 cri.go:89] found id: ""
	I1201 19:35:32.839573   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.839580   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:32.839586   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:32.839644   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:32.868062   54581 cri.go:89] found id: ""
	I1201 19:35:32.868075   54581 logs.go:282] 0 containers: []
	W1201 19:35:32.868082   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:32.868090   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:32.868099   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:32.923266   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:32.923285   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:32.934015   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:32.934030   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:33.005502   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:32.997033   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.997929   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:32.999760   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.000145   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:33.001895   15521 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:33.005512   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:33.005523   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:33.075965   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:33.075984   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.605628   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:35.617054   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:35.617126   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:35.646998   54581 cri.go:89] found id: ""
	I1201 19:35:35.647012   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.647019   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:35.647025   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:35.647086   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:35.676130   54581 cri.go:89] found id: ""
	I1201 19:35:35.676143   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.676150   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:35.676155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:35.676211   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:35.700589   54581 cri.go:89] found id: ""
	I1201 19:35:35.700602   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.700609   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:35.700616   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:35.700672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:35.725233   54581 cri.go:89] found id: ""
	I1201 19:35:35.725246   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.725253   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:35.725273   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:35.725343   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:35.750382   54581 cri.go:89] found id: ""
	I1201 19:35:35.750396   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.750403   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:35.750408   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:35.750462   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:35.775219   54581 cri.go:89] found id: ""
	I1201 19:35:35.775235   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.775243   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:35.775248   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:35.775320   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:35.800831   54581 cri.go:89] found id: ""
	I1201 19:35:35.800845   54581 logs.go:282] 0 containers: []
	W1201 19:35:35.800852   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:35.800859   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:35.800870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:35.866740   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:35.858616   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.859347   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861068   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861726   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.863343   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:35.858616   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.859347   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861068   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.861726   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:35.863343   15620 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:35.866756   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:35.866767   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:35.931013   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:35.931031   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:35.958721   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:35.958743   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:36.015847   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:36.015863   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.535518   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:38.545931   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:38.545993   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:38.571083   54581 cri.go:89] found id: ""
	I1201 19:35:38.571097   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.571104   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:38.571109   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:38.571170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:38.608738   54581 cri.go:89] found id: ""
	I1201 19:35:38.608752   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.608759   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:38.608765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:38.608820   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:38.635605   54581 cri.go:89] found id: ""
	I1201 19:35:38.635619   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.635626   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:38.635631   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:38.635689   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:38.668134   54581 cri.go:89] found id: ""
	I1201 19:35:38.668147   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.668155   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:38.668172   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:38.668231   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:38.693505   54581 cri.go:89] found id: ""
	I1201 19:35:38.693519   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.693526   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:38.693531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:38.693602   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:38.719017   54581 cri.go:89] found id: ""
	I1201 19:35:38.719031   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.719039   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:38.719044   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:38.719103   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:38.748727   54581 cri.go:89] found id: ""
	I1201 19:35:38.748740   54581 logs.go:282] 0 containers: []
	W1201 19:35:38.748747   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:38.748754   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:38.748765   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:38.778021   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:38.778037   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:38.838504   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:38.838524   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:38.851587   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:38.851603   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:38.919080   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:38.909975   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.911254   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.912341   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.913320   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.914352   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:38.909975   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.911254   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.912341   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.913320   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:38.914352   15741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:38.919115   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:38.919130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.484602   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:41.495239   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:41.495298   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:41.525151   54581 cri.go:89] found id: ""
	I1201 19:35:41.525165   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.525172   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:41.525191   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:41.525256   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:41.551287   54581 cri.go:89] found id: ""
	I1201 19:35:41.551301   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.551309   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:41.551329   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:41.551392   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:41.577108   54581 cri.go:89] found id: ""
	I1201 19:35:41.577124   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.577131   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:41.577136   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:41.577204   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:41.613970   54581 cri.go:89] found id: ""
	I1201 19:35:41.613983   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.613991   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:41.614005   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:41.614063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:41.647948   54581 cri.go:89] found id: ""
	I1201 19:35:41.647961   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.647968   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:41.647973   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:41.648038   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:41.675741   54581 cri.go:89] found id: ""
	I1201 19:35:41.675754   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.675761   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:41.675770   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:41.675827   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:41.701031   54581 cri.go:89] found id: ""
	I1201 19:35:41.701053   54581 logs.go:282] 0 containers: []
	W1201 19:35:41.701061   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:41.701068   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:41.701079   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:41.712066   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:41.712081   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:41.774820   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:41.767074   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.767651   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769208   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769794   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.771321   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:41.767074   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.767651   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769208   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.769794   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:41.771321   15832 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:41.774852   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:41.774864   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:41.837237   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:41.837254   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:41.867407   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:41.867423   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.425417   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:44.436694   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:44.436764   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:44.462550   54581 cri.go:89] found id: ""
	I1201 19:35:44.462565   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.462571   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:44.462577   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:44.462634   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:44.490237   54581 cri.go:89] found id: ""
	I1201 19:35:44.490250   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.490257   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:44.490262   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:44.490318   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:44.517417   54581 cri.go:89] found id: ""
	I1201 19:35:44.517431   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.517438   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:44.517443   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:44.517523   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:44.542502   54581 cri.go:89] found id: ""
	I1201 19:35:44.542516   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.542523   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:44.542528   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:44.542588   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:44.568636   54581 cri.go:89] found id: ""
	I1201 19:35:44.568650   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.568682   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:44.568688   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:44.568756   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:44.602872   54581 cri.go:89] found id: ""
	I1201 19:35:44.602891   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.602898   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:44.602904   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:44.602961   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:44.633265   54581 cri.go:89] found id: ""
	I1201 19:35:44.633280   54581 logs.go:282] 0 containers: []
	W1201 19:35:44.633287   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:44.633295   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:44.633305   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:44.704029   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:44.695965   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.696791   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698434   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698915   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.700082   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:44.695965   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.696791   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698434   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.698915   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:44.700082   15934 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:44.704040   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:44.704051   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:44.768055   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:44.768075   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:44.797083   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:44.797098   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:44.852537   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:44.852555   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.364630   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:47.374921   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:47.374978   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:47.399587   54581 cri.go:89] found id: ""
	I1201 19:35:47.399600   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.399607   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:47.399613   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:47.399672   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:47.426120   54581 cri.go:89] found id: ""
	I1201 19:35:47.426134   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.426141   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:47.426147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:47.426227   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:47.457662   54581 cri.go:89] found id: ""
	I1201 19:35:47.457676   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.457683   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:47.457689   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:47.457747   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:47.482682   54581 cri.go:89] found id: ""
	I1201 19:35:47.482702   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.482709   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:47.482728   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:47.482796   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:47.511319   54581 cri.go:89] found id: ""
	I1201 19:35:47.511334   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.511341   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:47.511346   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:47.511409   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:47.543730   54581 cri.go:89] found id: ""
	I1201 19:35:47.543742   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.543760   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:47.543765   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:47.543831   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:47.572333   54581 cri.go:89] found id: ""
	I1201 19:35:47.572347   54581 logs.go:282] 0 containers: []
	W1201 19:35:47.572355   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:47.572363   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:47.572385   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:47.637165   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:47.637184   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:47.648940   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:47.648956   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:47.711651   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:47.704333   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.704738   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706241   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706574   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.708054   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:47.704333   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.704738   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706241   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.706574   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:47.708054   16047 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:47.711662   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:47.711681   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:47.773144   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:47.773163   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:50.303086   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:50.313234   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:50.313293   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:50.337483   54581 cri.go:89] found id: ""
	I1201 19:35:50.337515   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.337522   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:50.337527   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:50.337583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:50.363911   54581 cri.go:89] found id: ""
	I1201 19:35:50.363927   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.363934   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:50.363939   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:50.363994   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:50.388359   54581 cri.go:89] found id: ""
	I1201 19:35:50.388373   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.388380   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:50.388386   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:50.388441   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:50.412983   54581 cri.go:89] found id: ""
	I1201 19:35:50.412996   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.413003   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:50.413014   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:50.413073   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:50.440996   54581 cri.go:89] found id: ""
	I1201 19:35:50.441017   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.441024   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:50.441030   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:50.441085   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:50.467480   54581 cri.go:89] found id: ""
	I1201 19:35:50.467493   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.467501   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:50.467506   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:50.467567   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:50.494388   54581 cri.go:89] found id: ""
	I1201 19:35:50.494402   54581 logs.go:282] 0 containers: []
	W1201 19:35:50.494409   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:50.494416   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:50.494427   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:50.550339   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:50.550359   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:50.561242   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:50.561258   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:50.633849   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:50.625518   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.626220   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.627078   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628200   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628973   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:50.625518   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.626220   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.627078   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628200   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:50.628973   16147 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:50.633860   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:50.633870   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:50.702260   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:50.702280   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:53.234959   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:53.245018   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:53.245083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:53.276399   54581 cri.go:89] found id: ""
	I1201 19:35:53.276413   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.276420   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:53.276425   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:53.276491   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:53.305853   54581 cri.go:89] found id: ""
	I1201 19:35:53.305866   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.305873   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:53.305878   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:53.305935   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:53.335241   54581 cri.go:89] found id: ""
	I1201 19:35:53.335255   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.335263   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:53.335269   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:53.335328   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:53.359467   54581 cri.go:89] found id: ""
	I1201 19:35:53.359481   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.359488   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:53.359493   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:53.359550   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:53.384120   54581 cri.go:89] found id: ""
	I1201 19:35:53.384134   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.384141   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:53.384147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:53.384203   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:53.414128   54581 cri.go:89] found id: ""
	I1201 19:35:53.414141   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.414149   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:53.414155   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:53.414214   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:53.439408   54581 cri.go:89] found id: ""
	I1201 19:35:53.439421   54581 logs.go:282] 0 containers: []
	W1201 19:35:53.439428   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:53.439436   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:53.439446   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:53.495007   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:53.495026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:53.505932   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:53.505948   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:53.572678   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:53.564592   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.565379   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567252   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.567720   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:53.569289   16251 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:53.572688   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:53.572702   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:53.650600   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:53.650621   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:56.183319   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:56.193782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:56.193843   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:56.224114   54581 cri.go:89] found id: ""
	I1201 19:35:56.224128   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.224135   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:56.224140   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:56.224197   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:56.254013   54581 cri.go:89] found id: ""
	I1201 19:35:56.254027   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.254034   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:56.254040   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:56.254102   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:56.279886   54581 cri.go:89] found id: ""
	I1201 19:35:56.279900   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.279908   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:56.279914   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:56.279976   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:56.304943   54581 cri.go:89] found id: ""
	I1201 19:35:56.304956   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.304963   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:56.304969   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:56.305025   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:56.328633   54581 cri.go:89] found id: ""
	I1201 19:35:56.328647   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.328654   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:56.328659   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:56.328715   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:56.357255   54581 cri.go:89] found id: ""
	I1201 19:35:56.357269   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.357276   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:56.357281   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:56.357340   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:56.381420   54581 cri.go:89] found id: ""
	I1201 19:35:56.381434   54581 logs.go:282] 0 containers: []
	W1201 19:35:56.381441   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:56.381449   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:56.381459   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:56.439709   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:56.439728   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:35:56.450590   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:56.450605   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:56.516412   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:56.508848   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.509329   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511074   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.511373   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:56.512882   16358 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:56.516423   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:56.516435   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:56.577800   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:56.577828   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.114477   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:35:59.124117   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:35:59.124179   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:35:59.151351   54581 cri.go:89] found id: ""
	I1201 19:35:59.151364   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.151372   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:35:59.151377   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:35:59.151433   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:35:59.179997   54581 cri.go:89] found id: ""
	I1201 19:35:59.180010   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.180017   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:35:59.180022   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:35:59.180084   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:35:59.204818   54581 cri.go:89] found id: ""
	I1201 19:35:59.204832   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.204859   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:35:59.204864   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:35:59.204923   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:35:59.230443   54581 cri.go:89] found id: ""
	I1201 19:35:59.230456   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.230464   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:35:59.230470   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:35:59.230524   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:35:59.254548   54581 cri.go:89] found id: ""
	I1201 19:35:59.254561   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.254569   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:35:59.254574   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:35:59.254629   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:35:59.282564   54581 cri.go:89] found id: ""
	I1201 19:35:59.282577   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.282584   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:35:59.282590   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:35:59.282645   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:35:59.310544   54581 cri.go:89] found id: ""
	I1201 19:35:59.310557   54581 logs.go:282] 0 containers: []
	W1201 19:35:59.310565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:35:59.310573   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:35:59.310587   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:35:59.377012   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:35:59.369344   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.370045   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.371697   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.372091   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:35:59.373763   16460 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:35:59.377021   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:35:59.377032   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:35:59.441479   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:35:59.441511   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:35:59.471908   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:35:59.471924   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:35:59.527613   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:35:59.527631   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.040294   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:02.051787   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:02.051869   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:02.077788   54581 cri.go:89] found id: ""
	I1201 19:36:02.077801   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.077808   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:02.077814   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:02.077871   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:02.103346   54581 cri.go:89] found id: ""
	I1201 19:36:02.103359   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.103366   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:02.103371   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:02.103427   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:02.128949   54581 cri.go:89] found id: ""
	I1201 19:36:02.128963   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.128970   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:02.128975   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:02.129033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:02.153585   54581 cri.go:89] found id: ""
	I1201 19:36:02.153598   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.153605   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:02.153611   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:02.153668   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:02.180499   54581 cri.go:89] found id: ""
	I1201 19:36:02.180513   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.180520   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:02.180531   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:02.180592   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:02.206116   54581 cri.go:89] found id: ""
	I1201 19:36:02.206131   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.206138   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:02.206144   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:02.206210   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:02.232470   54581 cri.go:89] found id: ""
	I1201 19:36:02.232484   54581 logs.go:282] 0 containers: []
	W1201 19:36:02.232492   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:02.232500   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:02.232513   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:02.295347   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:02.295367   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:02.323002   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:02.323018   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:02.382028   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:02.382046   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:02.393159   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:02.393176   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:02.457522   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:02.448910   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.449447   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.451449   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.452072   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:02.453970   16583 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:04.957729   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:04.967951   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:04.968012   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:04.993754   54581 cri.go:89] found id: ""
	I1201 19:36:04.993769   54581 logs.go:282] 0 containers: []
	W1201 19:36:04.993776   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:04.993782   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:04.993844   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:05.019859   54581 cri.go:89] found id: ""
	I1201 19:36:05.019873   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.019881   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:05.019886   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:05.019943   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:05.047016   54581 cri.go:89] found id: ""
	I1201 19:36:05.047031   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.047038   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:05.047046   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:05.047107   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:05.072292   54581 cri.go:89] found id: ""
	I1201 19:36:05.072306   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.072313   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:05.072318   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:05.072377   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:05.099842   54581 cri.go:89] found id: ""
	I1201 19:36:05.099857   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.099864   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:05.099870   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:05.099926   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:05.125552   54581 cri.go:89] found id: ""
	I1201 19:36:05.125566   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.125573   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:05.125579   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:05.125635   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:05.150637   54581 cri.go:89] found id: ""
	I1201 19:36:05.150651   54581 logs.go:282] 0 containers: []
	W1201 19:36:05.150659   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:05.150667   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:05.150677   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:05.218391   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:05.218410   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:05.246651   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:05.246670   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:05.303677   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:05.303694   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:05.314794   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:05.314809   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:05.380077   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:05.371997   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.372682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374411   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.374929   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:05.376682   16688 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:07.881622   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:07.893048   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:07.893109   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:07.918109   54581 cri.go:89] found id: ""
	I1201 19:36:07.918122   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.918129   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:07.918134   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:07.918196   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:07.943504   54581 cri.go:89] found id: ""
	I1201 19:36:07.943518   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.943525   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:07.943536   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:07.943595   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:07.969943   54581 cri.go:89] found id: ""
	I1201 19:36:07.969958   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.969965   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:07.969971   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:07.970033   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:07.994994   54581 cri.go:89] found id: ""
	I1201 19:36:07.995009   54581 logs.go:282] 0 containers: []
	W1201 19:36:07.995015   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:07.995021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:07.995083   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:08.020591   54581 cri.go:89] found id: ""
	I1201 19:36:08.020605   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.020612   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:08.020617   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:08.020676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:08.053041   54581 cri.go:89] found id: ""
	I1201 19:36:08.053056   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.053063   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:08.053069   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:08.053129   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:08.084333   54581 cri.go:89] found id: ""
	I1201 19:36:08.084346   54581 logs.go:282] 0 containers: []
	W1201 19:36:08.084353   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:08.084361   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:08.084371   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:08.099534   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:08.099551   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:08.163985   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:08.155727   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.156308   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158274   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.158945   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:08.160608   16777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:08.163995   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:08.164006   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:08.224823   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:08.224840   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:08.256602   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:08.256618   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:10.818842   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:10.829650   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:10.829713   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:10.866261   54581 cri.go:89] found id: ""
	I1201 19:36:10.866275   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.866293   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:10.866299   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:10.866378   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:10.902129   54581 cri.go:89] found id: ""
	I1201 19:36:10.902157   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.902166   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:10.902171   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:10.902287   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:10.935780   54581 cri.go:89] found id: ""
	I1201 19:36:10.935796   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.935803   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:10.935809   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:10.935868   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:10.961965   54581 cri.go:89] found id: ""
	I1201 19:36:10.961979   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.961987   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:10.961993   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:10.962050   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:10.988752   54581 cri.go:89] found id: ""
	I1201 19:36:10.988765   54581 logs.go:282] 0 containers: []
	W1201 19:36:10.988772   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:10.988778   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:10.988855   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:11.013768   54581 cri.go:89] found id: ""
	I1201 19:36:11.013783   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.013790   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:11.013795   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:11.013852   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:11.039944   54581 cri.go:89] found id: ""
	I1201 19:36:11.039959   54581 logs.go:282] 0 containers: []
	W1201 19:36:11.039982   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:11.039992   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:11.040003   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:11.096281   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:11.096300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:11.107964   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:11.107989   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:11.174240   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:11.165827   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.166729   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168408   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.168788   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:11.170369   16881 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:11.174253   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:11.174265   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:11.240383   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:11.240406   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.770524   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:13.780691   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:13.780754   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:13.805306   54581 cri.go:89] found id: ""
	I1201 19:36:13.805321   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.805328   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:13.805333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:13.805390   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:13.830209   54581 cri.go:89] found id: ""
	I1201 19:36:13.830223   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.830229   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:13.830235   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:13.830294   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:13.859814   54581 cri.go:89] found id: ""
	I1201 19:36:13.859827   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.859834   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:13.859839   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:13.859905   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:13.888545   54581 cri.go:89] found id: ""
	I1201 19:36:13.888559   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.888567   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:13.888573   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:13.888642   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:13.918445   54581 cri.go:89] found id: ""
	I1201 19:36:13.918459   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.918466   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:13.918471   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:13.918530   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:13.944112   54581 cri.go:89] found id: ""
	I1201 19:36:13.944125   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.944132   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:13.944147   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:13.944206   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:13.969842   54581 cri.go:89] found id: ""
	I1201 19:36:13.969856   54581 logs.go:282] 0 containers: []
	W1201 19:36:13.969863   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:13.969872   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:13.969882   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:13.999132   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:13.999150   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:14.056959   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:14.056979   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:14.068288   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:14.068304   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:14.137988   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:14.128502   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.129198   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.131274   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.132362   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.133913   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:14.128502   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.129198   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.131274   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.132362   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:14.133913   16997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:14.137997   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:14.138008   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:16.704768   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:16.715111   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:16.715170   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:16.740051   54581 cri.go:89] found id: ""
	I1201 19:36:16.740065   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.740072   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:16.740078   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:16.740150   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:16.765291   54581 cri.go:89] found id: ""
	I1201 19:36:16.765309   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.765317   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:16.765323   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:16.765380   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:16.790212   54581 cri.go:89] found id: ""
	I1201 19:36:16.790226   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.790233   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:16.790238   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:16.790297   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:16.814700   54581 cri.go:89] found id: ""
	I1201 19:36:16.814714   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.814721   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:16.814726   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:16.814785   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:16.851986   54581 cri.go:89] found id: ""
	I1201 19:36:16.852000   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.852007   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:16.852012   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:16.852067   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:16.883217   54581 cri.go:89] found id: ""
	I1201 19:36:16.883231   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.883237   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:16.883243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:16.883301   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:16.922552   54581 cri.go:89] found id: ""
	I1201 19:36:16.922566   54581 logs.go:282] 0 containers: []
	W1201 19:36:16.922574   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:16.922582   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:16.922591   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:16.982282   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:16.982300   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:16.993387   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:16.993401   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:17.063398   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:17.055109   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.055736   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.057541   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.058088   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.059799   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:17.055109   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.055736   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.057541   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.058088   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:17.059799   17092 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:17.063409   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:17.063421   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:17.125575   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:17.125594   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.654741   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:19.665378   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:19.665445   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:19.690531   54581 cri.go:89] found id: ""
	I1201 19:36:19.690545   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.690553   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:19.690559   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:19.690617   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:19.715409   54581 cri.go:89] found id: ""
	I1201 19:36:19.715423   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.715431   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:19.715436   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:19.715494   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:19.743995   54581 cri.go:89] found id: ""
	I1201 19:36:19.744009   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.744016   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:19.744021   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:19.744078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:19.769191   54581 cri.go:89] found id: ""
	I1201 19:36:19.769204   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.769212   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:19.769217   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:19.769286   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:19.793617   54581 cri.go:89] found id: ""
	I1201 19:36:19.793631   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.793638   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:19.793644   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:19.793704   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:19.818818   54581 cri.go:89] found id: ""
	I1201 19:36:19.818832   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.818840   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:19.818845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:19.818914   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:19.852332   54581 cri.go:89] found id: ""
	I1201 19:36:19.852346   54581 logs.go:282] 0 containers: []
	W1201 19:36:19.852368   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:19.852378   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:19.852389   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:19.884627   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:19.884642   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:19.947006   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:19.947026   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:19.958524   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:19.958539   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:20.040965   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:20.013332   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.014049   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.015824   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.016507   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.018079   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:20.013332   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.014049   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.015824   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.016507   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:20.018079   17215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:20.040976   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:20.040988   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:22.622750   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:22.637572   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:22.637637   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:22.667700   54581 cri.go:89] found id: ""
	I1201 19:36:22.667714   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.667721   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:22.667727   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:22.667786   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:22.700758   54581 cri.go:89] found id: ""
	I1201 19:36:22.700776   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.700802   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:22.700815   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:22.700916   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:22.727217   54581 cri.go:89] found id: ""
	I1201 19:36:22.727230   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.727238   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:22.727243   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:22.727299   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:22.753365   54581 cri.go:89] found id: ""
	I1201 19:36:22.753379   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.753386   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:22.753392   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:22.753459   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:22.779306   54581 cri.go:89] found id: ""
	I1201 19:36:22.779320   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.779327   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:22.779336   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:22.779394   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:22.804830   54581 cri.go:89] found id: ""
	I1201 19:36:22.804844   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.804860   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:22.804866   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:22.804924   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:22.831440   54581 cri.go:89] found id: ""
	I1201 19:36:22.831470   54581 logs.go:282] 0 containers: []
	W1201 19:36:22.831478   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:22.831486   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:22.831496   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:22.889394   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:22.889412   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:22.901968   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:22.901983   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:22.974567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:22.965837   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.966826   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968514   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968930   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.970623   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:22.965837   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.966826   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968514   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.968930   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:22.970623   17310 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:22.974577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:22.974588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:23.043112   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:23.043130   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.573279   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:25.584019   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:25.584078   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:25.613416   54581 cri.go:89] found id: ""
	I1201 19:36:25.613430   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.613446   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:25.613452   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:25.613541   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:25.638108   54581 cri.go:89] found id: ""
	I1201 19:36:25.638121   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.638132   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:25.638138   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:25.638198   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:25.667581   54581 cri.go:89] found id: ""
	I1201 19:36:25.667596   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.667603   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:25.667608   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:25.667676   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:25.695307   54581 cri.go:89] found id: ""
	I1201 19:36:25.695320   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.695328   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:25.695333   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:25.695396   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:25.719360   54581 cri.go:89] found id: ""
	I1201 19:36:25.719386   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.719394   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:25.719399   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:25.719466   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:25.745097   54581 cri.go:89] found id: ""
	I1201 19:36:25.745120   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.745127   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:25.745133   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:25.745207   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:25.769545   54581 cri.go:89] found id: ""
	I1201 19:36:25.769558   54581 logs.go:282] 0 containers: []
	W1201 19:36:25.769565   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:25.769573   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:25.769584   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:25.799870   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:25.799887   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:25.856015   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:25.856035   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:25.868391   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:25.868407   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:25.939423   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:25.931657   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.932304   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.933988   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.934305   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.935915   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:25.931657   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.932304   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.933988   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.934305   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:25.935915   17427 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:25.939433   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:25.939443   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.503343   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:28.515763   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:28.515836   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:28.541630   54581 cri.go:89] found id: ""
	I1201 19:36:28.541644   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.541652   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:28.541657   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:28.541728   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:28.568196   54581 cri.go:89] found id: ""
	I1201 19:36:28.568210   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.568217   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:28.568222   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:28.568280   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:28.593437   54581 cri.go:89] found id: ""
	I1201 19:36:28.593450   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.593457   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:28.593463   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:28.593557   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:28.619497   54581 cri.go:89] found id: ""
	I1201 19:36:28.619511   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.619518   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:28.619523   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:28.619583   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:28.647866   54581 cri.go:89] found id: ""
	I1201 19:36:28.647880   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.647887   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:28.647893   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:28.647950   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:28.673922   54581 cri.go:89] found id: ""
	I1201 19:36:28.673935   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.673943   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:28.673949   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:28.674021   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:28.698912   54581 cri.go:89] found id: ""
	I1201 19:36:28.698926   54581 logs.go:282] 0 containers: []
	W1201 19:36:28.698933   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:28.698941   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:28.698963   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:28.756082   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:28.756100   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:28.767897   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:28.767913   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:28.836301   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:28.825893   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.826883   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.828599   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.829181   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:28.830726   17519 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:28.836312   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:28.836330   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:28.907788   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:28.907807   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.438620   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:31.448979   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:31.449042   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:31.475188   54581 cri.go:89] found id: ""
	I1201 19:36:31.475202   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.475209   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:31.475215   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:31.475281   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:31.500385   54581 cri.go:89] found id: ""
	I1201 19:36:31.500398   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.500405   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:31.500411   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:31.500468   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:31.525394   54581 cri.go:89] found id: ""
	I1201 19:36:31.525407   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.525414   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:31.525419   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:31.525481   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:31.550792   54581 cri.go:89] found id: ""
	I1201 19:36:31.550808   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.550815   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:31.550821   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:31.550880   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:31.578076   54581 cri.go:89] found id: ""
	I1201 19:36:31.578090   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.578097   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:31.578102   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:31.578159   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:31.604021   54581 cri.go:89] found id: ""
	I1201 19:36:31.604035   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.604042   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:31.604047   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:31.604108   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:31.633105   54581 cri.go:89] found id: ""
	I1201 19:36:31.633119   54581 logs.go:282] 0 containers: []
	W1201 19:36:31.633126   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:31.633134   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:31.633145   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:31.663524   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:31.663540   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:31.723171   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:31.723189   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:31.734100   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:31.734115   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:31.796567   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:31.788762   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.789575   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791243   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.791739   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:31.793270   17636 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:31.796577   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:31.796588   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:34.366168   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:34.376457   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:36:34.376516   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:36:34.404948   54581 cri.go:89] found id: ""
	I1201 19:36:34.404977   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.404985   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:36:34.404991   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:36:34.405063   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:36:34.431692   54581 cri.go:89] found id: ""
	I1201 19:36:34.431706   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.431713   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:36:34.431718   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:36:34.431779   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:36:34.456671   54581 cri.go:89] found id: ""
	I1201 19:36:34.456685   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.456692   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:36:34.456697   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:36:34.456755   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:36:34.481585   54581 cri.go:89] found id: ""
	I1201 19:36:34.481612   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.481620   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:36:34.481626   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:36:34.481696   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:36:34.506818   54581 cri.go:89] found id: ""
	I1201 19:36:34.506832   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.506839   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:36:34.506845   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:36:34.506906   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:36:34.535407   54581 cri.go:89] found id: ""
	I1201 19:36:34.535421   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.535428   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:36:34.535433   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:36:34.535492   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:36:34.561311   54581 cri.go:89] found id: ""
	I1201 19:36:34.561324   54581 logs.go:282] 0 containers: []
	W1201 19:36:34.561331   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:36:34.561339   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:36:34.561350   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 19:36:34.592150   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:36:34.592167   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:36:34.648352   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:36:34.648370   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:36:34.659451   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:36:34.659467   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:36:34.728942   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:36:34.721152   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.721962   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.723551   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.724022   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:36:34.725635   17740 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:36:34.728952   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:36:34.728962   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:36:37.291213   54581 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:36:37.301261   54581 kubeadm.go:602] duration metric: took 4m4.008784532s to restartPrimaryControlPlane
	W1201 19:36:37.301323   54581 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1201 19:36:37.301393   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:36:37.706665   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:36:37.720664   54581 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 19:36:37.728529   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:36:37.728581   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:36:37.736430   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:36:37.736440   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:36:37.736492   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:36:37.744494   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:36:37.744550   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:36:37.752457   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:36:37.760187   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:36:37.760243   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:36:37.768060   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.775900   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:36:37.775969   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:36:37.783655   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:36:37.791670   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:36:37.791723   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:36:37.799641   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:36:37.841794   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:36:37.841853   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:36:37.909907   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:36:37.909969   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:36:37.910004   54581 kubeadm.go:319] OS: Linux
	I1201 19:36:37.910048   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:36:37.910095   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:36:37.910141   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:36:37.910188   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:36:37.910235   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:36:37.910281   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:36:37.910325   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:36:37.910372   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:36:37.910417   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:36:37.982652   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:36:37.982760   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:36:37.982849   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:36:37.989962   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:36:37.995459   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:36:37.995557   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:36:37.995632   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:36:37.995718   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:36:37.995796   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:36:37.995875   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:36:37.995938   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:36:37.996008   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:36:37.996076   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:36:37.996160   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:36:37.996243   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:36:37.996290   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:36:37.996352   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:36:38.264574   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:36:38.510797   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:36:39.269570   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:36:39.443703   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:36:40.036623   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:36:40.036725   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:36:40.042253   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:36:40.045573   54581 out.go:252]   - Booting up control plane ...
	I1201 19:36:40.045681   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:36:40.045758   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:36:40.050263   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:36:40.088031   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:36:40.088133   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:36:40.088246   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:36:40.088332   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:36:40.088370   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:36:40.243689   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:36:40.243803   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:40:40.243834   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000165379s
	I1201 19:40:40.243866   54581 kubeadm.go:319] 
	I1201 19:40:40.243923   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:40:40.243956   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:40:40.244085   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:40:40.244090   54581 kubeadm.go:319] 
	I1201 19:40:40.244193   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:40:40.244226   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:40:40.244256   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:40:40.244260   54581 kubeadm.go:319] 
	I1201 19:40:40.248975   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:40:40.249435   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:40:40.249566   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:40:40.249901   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1201 19:40:40.249908   54581 kubeadm.go:319] 
	I1201 19:40:40.249980   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1201 19:40:40.250118   54581 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000165379s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1201 19:40:40.250247   54581 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 19:40:40.662369   54581 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:40:40.675843   54581 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 19:40:40.675896   54581 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 19:40:40.683554   54581 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 19:40:40.683563   54581 kubeadm.go:158] found existing configuration files:
	
	I1201 19:40:40.683613   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1201 19:40:40.691612   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 19:40:40.691669   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 19:40:40.699280   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1201 19:40:40.706997   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 19:40:40.707052   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 19:40:40.714497   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.722891   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 19:40:40.722949   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 19:40:40.730907   54581 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1201 19:40:40.739761   54581 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 19:40:40.739818   54581 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 19:40:40.747474   54581 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 19:40:40.788983   54581 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 19:40:40.789292   54581 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 19:40:40.865634   54581 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 19:40:40.865697   54581 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 19:40:40.865734   54581 kubeadm.go:319] OS: Linux
	I1201 19:40:40.865777   54581 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 19:40:40.865824   54581 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 19:40:40.865869   54581 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 19:40:40.865916   54581 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 19:40:40.865963   54581 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 19:40:40.866013   54581 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 19:40:40.866057   54581 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 19:40:40.866104   54581 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 19:40:40.866149   54581 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 19:40:40.935875   54581 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 19:40:40.935986   54581 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 19:40:40.936084   54581 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 19:40:40.941886   54581 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 19:40:40.947334   54581 out.go:252]   - Generating certificates and keys ...
	I1201 19:40:40.947424   54581 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 19:40:40.947488   54581 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 19:40:40.947568   54581 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 19:40:40.947628   54581 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 19:40:40.947696   54581 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 19:40:40.947749   54581 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 19:40:40.947810   54581 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 19:40:40.947870   54581 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 19:40:40.947944   54581 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 19:40:40.948014   54581 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 19:40:40.948051   54581 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 19:40:40.948105   54581 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 19:40:41.580020   54581 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 19:40:42.099824   54581 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 19:40:42.537556   54581 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 19:40:42.996026   54581 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 19:40:43.565704   54581 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 19:40:43.566397   54581 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 19:40:43.569105   54581 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 19:40:43.572244   54581 out.go:252]   - Booting up control plane ...
	I1201 19:40:43.572342   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 19:40:43.572765   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 19:40:43.573983   54581 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 19:40:43.595015   54581 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 19:40:43.595116   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 19:40:43.603073   54581 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 19:40:43.603347   54581 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 19:40:43.603559   54581 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 19:40:43.744445   54581 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 19:40:43.744558   54581 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 19:44:43.744318   54581 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000287424s
	I1201 19:44:43.744348   54581 kubeadm.go:319] 
	I1201 19:44:43.744432   54581 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 19:44:43.744486   54581 kubeadm.go:319] 	- The kubelet is not running
	I1201 19:44:43.744623   54581 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 19:44:43.744628   54581 kubeadm.go:319] 
	I1201 19:44:43.744749   54581 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 19:44:43.744781   54581 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 19:44:43.744822   54581 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 19:44:43.744831   54581 kubeadm.go:319] 
	I1201 19:44:43.748926   54581 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 19:44:43.749322   54581 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 19:44:43.749424   54581 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 19:44:43.749683   54581 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1201 19:44:43.749689   54581 kubeadm.go:319] 
	I1201 19:44:43.749753   54581 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1201 19:44:43.749803   54581 kubeadm.go:403] duration metric: took 12m10.492478835s to StartCluster
	I1201 19:44:43.749833   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 19:44:43.749893   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 19:44:43.774966   54581 cri.go:89] found id: ""
	I1201 19:44:43.774979   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.774986   54581 logs.go:284] No container was found matching "kube-apiserver"
	I1201 19:44:43.774992   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 19:44:43.775053   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 19:44:43.800769   54581 cri.go:89] found id: ""
	I1201 19:44:43.800783   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.800790   54581 logs.go:284] No container was found matching "etcd"
	I1201 19:44:43.800796   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 19:44:43.800854   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 19:44:43.827282   54581 cri.go:89] found id: ""
	I1201 19:44:43.827295   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.827302   54581 logs.go:284] No container was found matching "coredns"
	I1201 19:44:43.827308   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 19:44:43.827364   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 19:44:43.853930   54581 cri.go:89] found id: ""
	I1201 19:44:43.853944   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.853951   54581 logs.go:284] No container was found matching "kube-scheduler"
	I1201 19:44:43.853957   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 19:44:43.854013   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 19:44:43.882816   54581 cri.go:89] found id: ""
	I1201 19:44:43.882830   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.882837   54581 logs.go:284] No container was found matching "kube-proxy"
	I1201 19:44:43.882843   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 19:44:43.882903   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 19:44:43.909261   54581 cri.go:89] found id: ""
	I1201 19:44:43.909274   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.909281   54581 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 19:44:43.909287   54581 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 19:44:43.909344   54581 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 19:44:43.933693   54581 cri.go:89] found id: ""
	I1201 19:44:43.933706   54581 logs.go:282] 0 containers: []
	W1201 19:44:43.933715   54581 logs.go:284] No container was found matching "kindnet"
	I1201 19:44:43.933724   54581 logs.go:123] Gathering logs for kubelet ...
	I1201 19:44:43.933733   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 19:44:43.990075   54581 logs.go:123] Gathering logs for dmesg ...
	I1201 19:44:43.990092   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 19:44:44.001155   54581 logs.go:123] Gathering logs for describe nodes ...
	I1201 19:44:44.001170   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 19:44:44.070458   54581 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1201 19:44:44.061396   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.062160   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064033   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.064772   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:44:44.066471   21534 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 19:44:44.070469   54581 logs.go:123] Gathering logs for containerd ...
	I1201 19:44:44.070479   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 19:44:44.136228   54581 logs.go:123] Gathering logs for container status ...
	I1201 19:44:44.136248   54581 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1201 19:44:44.166389   54581 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000287424s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1201 19:44:44.166422   54581 out.go:285] * 
	W1201 19:44:44.166502   54581 out.go:285] * 
	W1201 19:44:44.168627   54581 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 19:44:44.175592   54581 out.go:203] 
	W1201 19:44:44.179124   54581 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	W1201 19:44:44.179186   54581 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1201 19:44:44.179207   54581 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1201 19:44:44.182569   54581 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667922542Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667937483Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667949232Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667962787Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668042514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668060319Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668082095Z" level=info msg="runtime interface created"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668088528Z" level=info msg="created NRI interface"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668104946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668151788Z" level=info msg="Connect containerd service"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668662446Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.670384243Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682522727Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682782323Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682944050Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682830321Z" level=info msg="Start recovering state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708040674Z" level=info msg="Start event monitor"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708239258Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708327772Z" level=info msg="Start streaming server"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708412037Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708599093Z" level=info msg="runtime interface starting up..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708668573Z" level=info msg="starting plugins..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708729215Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708948574Z" level=info msg="containerd successfully booted in 0.060821s"
	Dec 01 19:32:31 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:46:42.011340   23064 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:42.012244   23064 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:42.014179   23064 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:42.014545   23064 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:46:42.016069   23064 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:46:42 up  1:29,  0 user,  load average: 0.21, 0.21, 0.37
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:46:38 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:39 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 474.
	Dec 01 19:46:39 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:39 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:39 functional-428744 kubelet[22948]: E1201 19:46:39.632413   22948 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:39 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:39 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:40 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 475.
	Dec 01 19:46:40 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:40 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:40 functional-428744 kubelet[22954]: E1201 19:46:40.383850   22954 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:40 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:40 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:41 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 476.
	Dec 01 19:46:41 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:41 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:41 functional-428744 kubelet[22972]: E1201 19:46:41.144092   22972 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:41 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:41 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:41 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 477.
	Dec 01 19:46:41 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:41 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:41 functional-428744 kubelet[23033]: E1201 19:46:41.892480   23033 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:41 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:41 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (360.214771ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect (2.43s)
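
Editor's note (not captured test output): the kubelet restart loop above fails validation because the host runs cgroup v1, which kubelet v1.35+ rejects by default. A minimal diagnostic sketch, assuming a Linux host with /sys/fs/cgroup mounted:

```shell
# Check which cgroup flavor is mounted on the affected host:
stat -fc %T /sys/fs/cgroup/
# "cgroup2fs" indicates cgroup v2; "tmpfs" indicates a cgroup v1 hierarchy
```

As the SystemVerification warning in the log notes, keeping cgroup v1 requires explicitly setting the kubelet configuration option 'FailCgroupV1' to 'false' and skipping that validation; migrating the host to cgroup v2 is the supported path.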

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim (241.66s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[... identical WARNING line repeated 9 more times while polling ...]
I1201 19:45:02.778014    4305 retry.go:31] will retry after 2.568031512s: Temporary Error: Get "http://10.110.12.91": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[... identical WARNING line repeated 12 more times while polling ...]
I1201 19:45:15.347384    4305 retry.go:31] will retry after 3.343414685s: Temporary Error: Get "http://10.110.12.91": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[... identical WARNING line repeated 12 more times while polling ...]
I1201 19:45:28.691828    4305 retry.go:31] will retry after 8.794211727s: Temporary Error: Get "http://10.110.12.91": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1201 19:45:47.488008    4305 retry.go:31] will retry after 10.867811845s: Temporary Error: Get "http://10.110.12.91": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
I1201 19:46:08.356958    4305 retry.go:31] will retry after 21.913089649s: Temporary Error: Get "http://10.110.12.91": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: [the above WARNING repeated 28 more times]
E1201 19:47:30.168069    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
functional_test_pvc_test.go:50: ***** TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: pod "integration-test=storage-provisioner" failed to start within 4m0s: context deadline exceeded ****
functional_test_pvc_test.go:50: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
functional_test_pvc_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (301.96381ms)

-- stdout --
	Stopped

-- /stdout --
functional_test_pvc_test.go:50: status error: exit status 2 (may be ok)
functional_test_pvc_test.go:50: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
functional_test_pvc_test.go:51: failed waiting for storage-provisioner: integration-test=storage-provisioner within 4m0s: context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (337.099594ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-428744 image load --daemon kicbase/echo-server:functional-428744 --alsologtostderr                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls                                                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image save kicbase/echo-server:functional-428744 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image rm kicbase/echo-server:functional-428744 --alsologtostderr                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls                                                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls                                                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image save --daemon kicbase/echo-server:functional-428744 --alsologtostderr                                                                   │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh sudo cat /etc/test/nested/copy/4305/hosts                                                                                                 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh sudo cat /etc/ssl/certs/4305.pem                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh sudo cat /usr/share/ca-certificates/4305.pem                                                                                              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh sudo cat /etc/ssl/certs/43052.pem                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh sudo cat /usr/share/ca-certificates/43052.pem                                                                                             │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls --format short --alsologtostderr                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ update-context │ functional-428744 update-context --alsologtostderr -v=2                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ ssh            │ functional-428744 ssh pgrep buildkitd                                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │                     │
	│ image          │ functional-428744 image build -t localhost/my-image:functional-428744 testdata/build --alsologtostderr                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls                                                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls --format yaml --alsologtostderr                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls --format json --alsologtostderr                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ image          │ functional-428744 image ls --format table --alsologtostderr                                                                                                     │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ update-context │ functional-428744 update-context --alsologtostderr -v=2                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	│ update-context │ functional-428744 update-context --alsologtostderr -v=2                                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:47 UTC │ 01 Dec 25 19:47 UTC │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:46:57
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:46:57.356677   71859 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:46:57.356876   71859 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.356902   71859 out.go:374] Setting ErrFile to fd 2...
	I1201 19:46:57.356926   71859 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.357228   71859 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:46:57.357689   71859 out.go:368] Setting JSON to false
	I1201 19:46:57.358532   71859 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":5369,"bootTime":1764613049,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:46:57.358629   71859 start.go:143] virtualization:  
	I1201 19:46:57.361983   71859 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:46:57.365564   71859 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:46:57.365715   71859 notify.go:221] Checking for updates...
	I1201 19:46:57.371851   71859 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:46:57.374888   71859 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:46:57.377862   71859 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:46:57.380678   71859 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:46:57.383580   71859 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:46:57.386969   71859 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:46:57.387571   71859 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:46:57.417643   71859 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:46:57.417826   71859 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.480404   71859 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.471331211 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.480521   71859 docker.go:319] overlay module found
	I1201 19:46:57.483650   71859 out.go:179] * Using the docker driver based on existing profile
	I1201 19:46:57.486552   71859 start.go:309] selected driver: docker
	I1201 19:46:57.486579   71859 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.486673   71859 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:46:57.486786   71859 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.541825   71859 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.532967583 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.542260   71859 cni.go:84] Creating CNI manager for ""
	I1201 19:46:57.542331   71859 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:46:57.542373   71859 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.547200   71859 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:47:04 functional-428744 containerd[10298]: time="2025-12-01T19:47:04.084996108Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:47:04 functional-428744 containerd[10298]: time="2025-12-01T19:47:04.085696425Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-428744\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.145376633Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-428744\""
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.148117902Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-428744\""
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.152096947Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.160475088Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-428744\" returns successfully"
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.389104265Z" level=info msg="No images store for sha256:df268673142e41d1756e2aa6f346d4d15112a5aa42ba43781fbf27b6cfcab9e8"
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.391256497Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-428744\""
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.400783707Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:47:05 functional-428744 containerd[10298]: time="2025-12-01T19:47:05.401476671Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-428744\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.209390476Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-428744\""
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.211815155Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-428744\""
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.214827000Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.223218918Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-428744\" returns successfully"
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.884718040Z" level=info msg="No images store for sha256:78566c0f41a9cb8da1e678aece8a6d4ab7daaa00cca1a424abb7a8a4335589e6"
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.887364079Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-428744\""
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.897179563Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:47:06 functional-428744 containerd[10298]: time="2025-12-01T19:47:06.897859753Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-428744\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:47:13 functional-428744 containerd[10298]: time="2025-12-01T19:47:13.294501190Z" level=info msg="connecting to shim gy0ef0qj6q04t99im8n6ymwzm" address="unix:///run/containerd/s/114b74fdf76ca1728c6636dd85dd6d1a8a1f057838ceacc6044253b4210dcb8a" namespace=k8s.io protocol=ttrpc version=3
	Dec 01 19:47:13 functional-428744 containerd[10298]: time="2025-12-01T19:47:13.387636114Z" level=info msg="shim disconnected" id=gy0ef0qj6q04t99im8n6ymwzm namespace=k8s.io
	Dec 01 19:47:13 functional-428744 containerd[10298]: time="2025-12-01T19:47:13.387861987Z" level=info msg="cleaning up after shim disconnected" id=gy0ef0qj6q04t99im8n6ymwzm namespace=k8s.io
	Dec 01 19:47:13 functional-428744 containerd[10298]: time="2025-12-01T19:47:13.387958497Z" level=info msg="cleaning up dead shim" id=gy0ef0qj6q04t99im8n6ymwzm namespace=k8s.io
	Dec 01 19:47:13 functional-428744 containerd[10298]: time="2025-12-01T19:47:13.659634717Z" level=info msg="ImageCreate event name:\"localhost/my-image:functional-428744\""
	Dec 01 19:47:13 functional-428744 containerd[10298]: time="2025-12-01T19:47:13.669800115Z" level=info msg="ImageCreate event name:\"sha256:4080434db4544c9ad723959f1a84a91883dfea2013af4850c1bea2166ef7f4e4\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 01 19:47:13 functional-428744 containerd[10298]: time="2025-12-01T19:47:13.670190208Z" level=info msg="ImageUpdate event name:\"localhost/my-image:functional-428744\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:48:54.532303   25750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:48:54.533483   25750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:48:54.534391   25750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:48:54.535490   25750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:48:54.536109   25750 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:48:54 up  1:31,  0 user,  load average: 0.50, 0.45, 0.44
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:48:51 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:51 functional-428744 kubelet[25617]: E1201 19:48:51.634927   25617 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:48:51 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:48:51 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:48:52 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 651.
	Dec 01 19:48:52 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:52 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:52 functional-428744 kubelet[25623]: E1201 19:48:52.385012   25623 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:48:52 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:48:52 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:48:53 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 652.
	Dec 01 19:48:53 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:53 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:53 functional-428744 kubelet[25629]: E1201 19:48:53.135302   25629 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:48:53 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:48:53 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:48:53 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 653.
	Dec 01 19:48:53 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:53 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:53 functional-428744 kubelet[25662]: E1201 19:48:53.894010   25662 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:48:53 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:48:53 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:48:54 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 654.
	Dec 01 19:48:54 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:48:54 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (315.351163ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim (241.66s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels (1.69s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-428744 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
functional_test.go:234: (dbg) Non-zero exit: kubectl --context functional-428744 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": exit status 1 (65.852805ms)

-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:236: failed to 'kubectl get nodes' with args "kubectl --context functional-428744 get nodes --output=go-template \"--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'\"": exit status 1
functional_test.go:242: expected to have label "minikube.k8s.io/commit" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/version" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/updated_at" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/name" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/primary" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

** /stderr **
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-428744
helpers_test.go:243: (dbg) docker inspect functional-428744:

-- stdout --
	[
	    {
	        "Id": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	        "Created": "2025-12-01T19:17:42.064970359Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 42803,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T19:17:42.147832287Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hostname",
	        "HostsPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/hosts",
	        "LogPath": "/var/lib/docker/containers/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270/0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270-json.log",
	        "Name": "/functional-428744",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-428744:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-428744",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "0a5f71818186b6efe00b0c4fd703113b9db93449ab67fc975198a29e2a89e270",
	                "LowerDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/merged",
	                "UpperDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/diff",
	                "WorkDir": "/var/lib/docker/overlay2/1a62d13b7fac30f74f1b012c6abe37674e739912606e8fb507d0d12f173758a0/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-428744",
	                "Source": "/var/lib/docker/volumes/functional-428744/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-428744",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-428744",
	                "name.minikube.sigs.k8s.io": "functional-428744",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "51251ff92164671747855c7e0b3049c8a41696f58071f065fdb32c7fdee7e56a",
	            "SandboxKey": "/var/run/docker/netns/51251ff92164",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-428744": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "22:f4:3c:a2:cd:d7",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "32e9eb731fe0a52c62a7b3657fd1dee3e6c43cd7ae203e31dab0af674dff0487",
	                    "EndpointID": "841971828a3ae8760afb7fa3bf2628bc9423d4b0ccde294eae5b28aecb27b14d",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-428744",
	                        "0a5f71818186"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-428744 -n functional-428744: exit status 2 (305.124498ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service   │ functional-428744 service hello-node --url --format={{.IP}}                                                                                         │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ service   │ functional-428744 service hello-node --url                                                                                                          │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001:/mount-9p --alsologtostderr -v=1              │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh -- ls -la /mount-9p                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh cat /mount-9p/test-1764618408110613921                                                                                        │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh sudo umount -f /mount-9p                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo894755151/001:/mount-9p --alsologtostderr -v=1 --port 46464  │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh -- ls -la /mount-9p                                                                                                           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh sudo umount -f /mount-9p                                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount1 --alsologtostderr -v=1                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount2 --alsologtostderr -v=1                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount1                                                                                                            │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount     │ -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount3 --alsologtostderr -v=1                │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ ssh       │ functional-428744 ssh findmnt -T /mount2                                                                                                            │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ ssh       │ functional-428744 ssh findmnt -T /mount3                                                                                                            │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │ 01 Dec 25 19:46 UTC │
	│ mount     │ -p functional-428744 --kill=true                                                                                                                    │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ start     │ -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ start     │ -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ start     │ -p functional-428744 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0           │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-428744 --alsologtostderr -v=1                                                                                      │ functional-428744 │ jenkins │ v1.37.0 │ 01 Dec 25 19:46 UTC │                     │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:46:57
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:46:57.356677   71859 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:46:57.356876   71859 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.356902   71859 out.go:374] Setting ErrFile to fd 2...
	I1201 19:46:57.356926   71859 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.357228   71859 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:46:57.357689   71859 out.go:368] Setting JSON to false
	I1201 19:46:57.358532   71859 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":5369,"bootTime":1764613049,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:46:57.358629   71859 start.go:143] virtualization:  
	I1201 19:46:57.361983   71859 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:46:57.365564   71859 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:46:57.365715   71859 notify.go:221] Checking for updates...
	I1201 19:46:57.371851   71859 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:46:57.374888   71859 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:46:57.377862   71859 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:46:57.380678   71859 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:46:57.383580   71859 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:46:57.386969   71859 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:46:57.387571   71859 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:46:57.417643   71859 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:46:57.417826   71859 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.480404   71859 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.471331211 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.480521   71859 docker.go:319] overlay module found
	I1201 19:46:57.483650   71859 out.go:179] * Using the docker driver based on existing profile
	I1201 19:46:57.486552   71859 start.go:309] selected driver: docker
	I1201 19:46:57.486579   71859 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p Mou
ntUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.486673   71859 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:46:57.486786   71859 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.541825   71859 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.532967583 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.542260   71859 cni.go:84] Creating CNI manager for ""
	I1201 19:46:57.542331   71859 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:46:57.542373   71859 start.go:353] cluster config:
	{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disab
leCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.547200   71859 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667922542Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667937483Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667949232Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.667962787Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668042514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668060319Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668082095Z" level=info msg="runtime interface created"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668088528Z" level=info msg="created NRI interface"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668104946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668151788Z" level=info msg="Connect containerd service"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.668662446Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.670384243Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682522727Z" level=info msg="Start subscribing containerd event"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682782323Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682944050Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.682830321Z" level=info msg="Start recovering state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708040674Z" level=info msg="Start event monitor"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708239258Z" level=info msg="Start cni network conf syncer for default"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708327772Z" level=info msg="Start streaming server"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708412037Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708599093Z" level=info msg="runtime interface starting up..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708668573Z" level=info msg="starting plugins..."
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708729215Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 01 19:32:31 functional-428744 containerd[10298]: time="2025-12-01T19:32:31.708948574Z" level=info msg="containerd successfully booted in 0.060821s"
	Dec 01 19:32:31 functional-428744 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1201 19:47:00.537679   24051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:47:00.538715   24051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:47:00.539747   24051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:47:00.540344   24051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1201 19:47:00.541986   24051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:47:00 up  1:29,  0 user,  load average: 1.17, 0.43, 0.43
	Linux functional-428744 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 19:46:57 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:57 functional-428744 kubelet[23810]: E1201 19:46:57.655737   23810 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:57 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:57 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:58 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 499.
	Dec 01 19:46:58 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:58 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:58 functional-428744 kubelet[23840]: E1201 19:46:58.394928   23840 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:58 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:58 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:59 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 500.
	Dec 01 19:46:59 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:59 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:59 functional-428744 kubelet[23938]: E1201 19:46:59.146022   23938 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:59 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:59 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:46:59 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 501.
	Dec 01 19:46:59 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:59 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:46:59 functional-428744 kubelet[23975]: E1201 19:46:59.891905   23975 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 19:46:59 functional-428744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 19:46:59 functional-428744 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 19:47:00 functional-428744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 502.
	Dec 01 19:47:00 functional-428744 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 19:47:00 functional-428744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-428744 -n functional-428744: exit status 2 (317.778167ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-428744" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels (1.69s)
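The repeated kubelet crash-loop logged above ("kubelet is configured to not run on a host using cgroup v1") is the root cause behind the connection-refused failures throughout this report: with kubelet down, the apiserver never comes up on port 8441. A quick way to confirm which cgroup hierarchy a host exposes is the filesystem type mounted at /sys/fs/cgroup: `cgroup2fs` indicates the unified cgroup v2 hierarchy, while a `tmpfs` there indicates the legacy v1 layout. A minimal sketch (the `cgroup_version` helper is hypothetical, not part of the test suite):

```shell
#!/bin/sh
# Map the filesystem type reported for /sys/fs/cgroup to a cgroup version.
# v2 hosts mount the unified hierarchy as cgroup2fs; v1 hosts mount a tmpfs
# with per-controller cgroup mounts underneath it.
cgroup_version() {
  case "$1" in
    cgroup2fs) echo "v2" ;;
    tmpfs)     echo "v1" ;;
    *)         echo "unknown" ;;
  esac
}

# On a live host you would feed it the real filesystem type:
#   cgroup_version "$(stat -fc %T /sys/fs/cgroup/)"
cgroup_version cgroup2fs   # -> v2
cgroup_version tmpfs       # -> v1
```

The kernel here is 5.15.0-1084-aws on Ubuntu 20.04, which defaults to the hybrid v1 layout, consistent with the validation error above; newer kubelet releases treat cgroup v1 as unsupported, so every systemd restart (counters 499–502 in the log) fails the same way.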

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel (0.54s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr]
functional_test_tunnel_test.go:190: tunnel command failed with unexpected error: exit code 103. stderr: I1201 19:44:52.126132   67598 out.go:360] Setting OutFile to fd 1 ...
I1201 19:44:52.126366   67598 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:44:52.126383   67598 out.go:374] Setting ErrFile to fd 2...
I1201 19:44:52.126391   67598 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:44:52.126727   67598 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:44:52.127186   67598 mustload.go:66] Loading cluster: functional-428744
I1201 19:44:52.127727   67598 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:44:52.128296   67598 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:44:52.157445   67598 host.go:66] Checking if "functional-428744" exists ...
I1201 19:44:52.157822   67598 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1201 19:44:52.269892   67598 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:44:52.260461399 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1201 19:44:52.270020   67598 api_server.go:166] Checking apiserver status ...
I1201 19:44:52.270075   67598 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1201 19:44:52.270118   67598 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:44:52.325599   67598 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
W1201 19:44:52.470110   67598 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1201 19:44:52.473526   67598 out.go:179] * The control-plane node functional-428744 apiserver is not running: (state=Stopped)
I1201 19:44:52.478288   67598 out.go:179]   To start a cluster, run: "minikube start -p functional-428744"

stdout: * The control-plane node functional-428744 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-428744"
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr] ...
helpers_test.go:525: unable to kill pid 67599: os: process already finished
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr] stderr:
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr] stderr:
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel (0.54s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup (0.18s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-428744 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:212: (dbg) Non-zero exit: kubectl --context functional-428744 apply -f testdata/testsvc.yaml: exit status 1 (179.792486ms)

** stderr ** 
	error: error validating "testdata/testsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test_tunnel_test.go:214: kubectl --context functional-428744 apply -f testdata/testsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup (0.18s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect (107.55s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:288: failed to hit nginx at "http://10.110.12.91": Temporary Error: Get "http://10.110.12.91": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
functional_test_tunnel_test.go:290: (dbg) Run:  kubectl --context functional-428744 get svc nginx-svc
functional_test_tunnel_test.go:290: (dbg) Non-zero exit: kubectl --context functional-428744 get svc nginx-svc: exit status 1 (57.961135ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test_tunnel_test.go:292: kubectl --context functional-428744 get svc nginx-svc failed: exit status 1
functional_test_tunnel_test.go:294: failed to kubectl get svc nginx-svc:
functional_test_tunnel_test.go:301: expected body to contain "Welcome to nginx!", but got *""*
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect (107.55s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-428744 create deployment hello-node --image kicbase/echo-server
functional_test.go:1451: (dbg) Non-zero exit: kubectl --context functional-428744 create deployment hello-node --image kicbase/echo-server: exit status 1 (65.978697ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1453: failed to create hello-node deployment with this command "kubectl --context functional-428744 create deployment hello-node --image kicbase/echo-server": exit status 1.
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp (0.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 service list
functional_test.go:1469: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 service list: exit status 103 (267.605845ms)

-- stdout --
	* The control-plane node functional-428744 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-428744"

-- /stdout --
functional_test.go:1471: failed to do service list. args "out/minikube-linux-arm64 -p functional-428744 service list" : exit status 103
functional_test.go:1474: expected 'service list' to contain *hello-node* but got -"* The control-plane node functional-428744 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-428744\"\n"-
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 service list -o json
functional_test.go:1499: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 service list -o json: exit status 103 (274.561557ms)

-- stdout --
	* The control-plane node functional-428744 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-428744"

-- /stdout --
functional_test.go:1501: failed to list services with json format. args "out/minikube-linux-arm64 -p functional-428744 service list -o json": exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 service --namespace=default --https --url hello-node
functional_test.go:1519: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 service --namespace=default --https --url hello-node: exit status 103 (266.978276ms)

-- stdout --
	* The control-plane node functional-428744 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-428744"

-- /stdout --
functional_test.go:1521: failed to get service url. args "out/minikube-linux-arm64 -p functional-428744 service --namespace=default --https --url hello-node" : exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format (0.26s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 service hello-node --url --format={{.IP}}
functional_test.go:1550: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 service hello-node --url --format={{.IP}}: exit status 103 (260.635284ms)

-- stdout --
	* The control-plane node functional-428744 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-428744"

-- /stdout --
functional_test.go:1552: failed to get service url with custom format. args "out/minikube-linux-arm64 -p functional-428744 service hello-node --url --format={{.IP}}": exit status 103
functional_test.go:1558: "* The control-plane node functional-428744 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-428744\"" is not a valid IP
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format (0.26s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 service hello-node --url
functional_test.go:1569: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 service hello-node --url: exit status 103 (268.029034ms)

-- stdout --
	* The control-plane node functional-428744 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-428744"

-- /stdout --
functional_test.go:1571: failed to get service url. args: "out/minikube-linux-arm64 -p functional-428744 service hello-node --url": exit status 103
functional_test.go:1575: found endpoint for hello-node: * The control-plane node functional-428744 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-428744"
functional_test.go:1579: failed to parse "* The control-plane node functional-428744 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-428744\"": parse "* The control-plane node functional-428744 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-428744\"": net/url: invalid control character in URL
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port (2.23s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1764618408110613921" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1764618408110613921" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1764618408110613921" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001/test-1764618408110613921
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (345.309572ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1201 19:46:48.456214    4305 retry.go:31] will retry after 273.461844ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec  1 19:46 created-by-test
-rw-r--r-- 1 docker docker 24 Dec  1 19:46 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec  1 19:46 test-1764618408110613921
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh cat /mount-9p/test-1764618408110613921
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-428744 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:148: (dbg) Non-zero exit: kubectl --context functional-428744 replace --force -f testdata/busybox-mount-test.yaml: exit status 1 (58.849466ms)

** stderr ** 
	error: error when deleting "testdata/busybox-mount-test.yaml": Delete "https://192.168.49.2:8441/api/v1/namespaces/default/pods/busybox-mount": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test_mount_test.go:150: failed to 'kubectl replace' for busybox-mount-test. args "kubectl --context functional-428744 replace --force -f testdata/busybox-mount-test.yaml" : exit status 1
functional_test_mount_test.go:80: "TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port" failed, getting debug info...
functional_test_mount_test.go:81: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates"
functional_test_mount_test.go:81: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates": exit status 1 (292.441428ms)

-- stdout --
	192.168.49.1 on /mount-9p type 9p (rw,relatime,sync,dirsync,dfltuid=1000,dfltgid=997,access=any,msize=262144,trans=tcp,noextend,port=33403)
	total 2
	-rw-r--r-- 1 docker docker 24 Dec  1 19:46 created-by-test
	-rw-r--r-- 1 docker docker 24 Dec  1 19:46 created-by-test-removed-by-pod
	-rw-r--r-- 1 docker docker 24 Dec  1 19:46 test-1764618408110613921
	cat: /mount-9p/pod-dates: No such file or directory

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:83: debugging command "out/minikube-linux-arm64 -p functional-428744 ssh \"mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates\"" failed : exit status 1
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001:/mount-9p --alsologtostderr -v=1] ...
functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001:/mount-9p --alsologtostderr -v=1] stdout:
* Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001 into VM as /mount-9p ...
- Mount type:   9p
- User ID:      docker
- Group ID:     docker
- Version:      9p2000.L
- Message Size: 262144
- Options:      map[]
- Bind Address: 192.168.49.1:33403
* Userspace file server: 
ufs starting
* Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001 to /mount-9p

* NOTE: This process must stay alive for the mount to be accessible ...
* Unmounting /mount-9p ...

functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001:/mount-9p --alsologtostderr -v=1] stderr:
I1201 19:46:48.175694   69977 out.go:360] Setting OutFile to fd 1 ...
I1201 19:46:48.175855   69977 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:46:48.175861   69977 out.go:374] Setting ErrFile to fd 2...
I1201 19:46:48.175866   69977 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:46:48.176138   69977 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:46:48.176400   69977 mustload.go:66] Loading cluster: functional-428744
I1201 19:46:48.176755   69977 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:46:48.177303   69977 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:46:48.201424   69977 host.go:66] Checking if "functional-428744" exists ...
I1201 19:46:48.201780   69977 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1201 19:46:48.311821   69977 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:48.300618115 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1201 19:46:48.311985   69977 cli_runner.go:164] Run: docker network inspect functional-428744 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1201 19:46:48.335768   69977 out.go:179] * Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001 into VM as /mount-9p ...
I1201 19:46:48.338935   69977 out.go:179]   - Mount type:   9p
I1201 19:46:48.341894   69977 out.go:179]   - User ID:      docker
I1201 19:46:48.345021   69977 out.go:179]   - Group ID:     docker
I1201 19:46:48.352855   69977 out.go:179]   - Version:      9p2000.L
I1201 19:46:48.355843   69977 out.go:179]   - Message Size: 262144
I1201 19:46:48.358954   69977 out.go:179]   - Options:      map[]
I1201 19:46:48.362282   69977 out.go:179]   - Bind Address: 192.168.49.1:33403
I1201 19:46:48.365168   69977 out.go:179] * Userspace file server: 
I1201 19:46:48.365567   69977 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1201 19:46:48.365689   69977 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:46:48.387301   69977 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
I1201 19:46:48.492313   69977 mount.go:180] unmount for /mount-9p ran successfully
I1201 19:46:48.492347   69977 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /mount-9p"
I1201 19:46:48.500891   69977 ssh_runner.go:195] Run: /bin/bash -c "sudo mount -t 9p -o dfltgid=$(grep ^docker: /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=33403,trans=tcp,version=9p2000.L 192.168.49.1 /mount-9p"
I1201 19:46:48.511341   69977 main.go:127] stdlog: ufs.go:141 connected
I1201 19:46:48.511499   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tversion tag 65535 msize 262144 version '9P2000.L'
I1201 19:46:48.511538   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rversion tag 65535 msize 262144 version '9P2000'
I1201 19:46:48.511775   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tattach tag 0 fid 0 afid 4294967295 uname 'nobody' nuname 0 aname ''
I1201 19:46:48.511838   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rattach tag 0 aqid (4433e db7410aa 'd')
I1201 19:46:48.512511   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 0
I1201 19:46:48.512589   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (4433e db7410aa 'd') m d775 at 0 mt 1764618408 l 4096 t 0 d 0 ext )
I1201 19:46:48.521541   69977 lock.go:50] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/.mount-process: {Name:mk9279aa892caab166b91e371f8617094275b2cd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1201 19:46:48.521794   69977 mount.go:105] mount successful: ""
I1201 19:46:48.525328   69977 out.go:179] * Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4165804553/001 to /mount-9p
I1201 19:46:48.528277   69977 out.go:203] 
I1201 19:46:48.531102   69977 out.go:179] * NOTE: This process must stay alive for the mount to be accessible ...
I1201 19:46:49.262078   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 0
I1201 19:46:49.262154   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (4433e db7410aa 'd') m d775 at 0 mt 1764618408 l 4096 t 0 d 0 ext )
I1201 19:46:49.262496   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 1 
I1201 19:46:49.262528   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 
I1201 19:46:49.262663   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Topen tag 0 fid 1 mode 0
I1201 19:46:49.262716   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Ropen tag 0 qid (4433e db7410aa 'd') iounit 0
I1201 19:46:49.262840   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 0
I1201 19:46:49.262872   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (4433e db7410aa 'd') m d775 at 0 mt 1764618408 l 4096 t 0 d 0 ext )
I1201 19:46:49.263033   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 0 count 262120
I1201 19:46:49.263144   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 258
I1201 19:46:49.263272   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 258 count 261862
I1201 19:46:49.263299   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.263443   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 258 count 262120
I1201 19:46:49.263471   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.263619   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1201 19:46:49.263648   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 (44340 db7410aa '') 
I1201 19:46:49.263784   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.263821   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (44340 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.263940   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.263976   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (44340 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.264110   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 2
I1201 19:46:49.264137   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.264266   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 2 0:'test-1764618408110613921' 
I1201 19:46:49.264297   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 (44342 db7410aa '') 
I1201 19:46:49.264423   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.264453   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('test-1764618408110613921' 'jenkins' 'jenkins' '' q (44342 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.264566   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.264593   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('test-1764618408110613921' 'jenkins' 'jenkins' '' q (44342 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.264714   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 2
I1201 19:46:49.264734   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.264863   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1201 19:46:49.264897   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 (44341 db7410aa '') 
I1201 19:46:49.265019   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.265051   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (44341 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.265167   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.265202   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (44341 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.265342   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 2
I1201 19:46:49.265362   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.265471   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 258 count 262120
I1201 19:46:49.265526   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.265651   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 1
I1201 19:46:49.265682   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.562470   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 1 0:'test-1764618408110613921' 
I1201 19:46:49.562539   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 (44342 db7410aa '') 
I1201 19:46:49.562728   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 1
I1201 19:46:49.562785   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('test-1764618408110613921' 'jenkins' 'jenkins' '' q (44342 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.562955   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 1 newfid 2 
I1201 19:46:49.562988   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 
I1201 19:46:49.563123   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Topen tag 0 fid 2 mode 0
I1201 19:46:49.563175   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Ropen tag 0 qid (44342 db7410aa '') iounit 0
I1201 19:46:49.563312   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 1
I1201 19:46:49.563346   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('test-1764618408110613921' 'jenkins' 'jenkins' '' q (44342 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.563514   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 2 offset 0 count 262120
I1201 19:46:49.563556   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 24
I1201 19:46:49.563707   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 2 offset 24 count 262120
I1201 19:46:49.563736   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.563902   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 2 offset 24 count 262120
I1201 19:46:49.563952   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.564162   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 2
I1201 19:46:49.564208   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.564382   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 1
I1201 19:46:49.564419   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.916729   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 0
I1201 19:46:49.916805   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (4433e db7410aa 'd') m d775 at 0 mt 1764618408 l 4096 t 0 d 0 ext )
I1201 19:46:49.917171   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 1 
I1201 19:46:49.917211   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 
I1201 19:46:49.917365   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Topen tag 0 fid 1 mode 0
I1201 19:46:49.917439   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Ropen tag 0 qid (4433e db7410aa 'd') iounit 0
I1201 19:46:49.917591   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 0
I1201 19:46:49.917634   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (4433e db7410aa 'd') m d775 at 0 mt 1764618408 l 4096 t 0 d 0 ext )
I1201 19:46:49.917794   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 0 count 262120
I1201 19:46:49.917891   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 258
I1201 19:46:49.918016   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 258 count 261862
I1201 19:46:49.918043   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.918197   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 258 count 262120
I1201 19:46:49.918226   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.918350   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1201 19:46:49.918383   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 (44340 db7410aa '') 
I1201 19:46:49.918510   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.918560   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (44340 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.918687   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.918718   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (44340 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.918852   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 2
I1201 19:46:49.918873   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.918999   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 2 0:'test-1764618408110613921' 
I1201 19:46:49.919027   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 (44342 db7410aa '') 
I1201 19:46:49.919151   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.919180   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('test-1764618408110613921' 'jenkins' 'jenkins' '' q (44342 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.919306   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.919340   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('test-1764618408110613921' 'jenkins' 'jenkins' '' q (44342 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.919467   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 2
I1201 19:46:49.919488   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.919626   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1201 19:46:49.919659   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rwalk tag 0 (44341 db7410aa '') 
I1201 19:46:49.919780   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.919814   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (44341 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.919946   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tstat tag 0 fid 2
I1201 19:46:49.920005   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (44341 db7410aa '') m 644 at 0 mt 1764618408 l 24 t 0 d 0 ext )
I1201 19:46:49.920150   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 2
I1201 19:46:49.920190   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.920306   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tread tag 0 fid 1 offset 258 count 262120
I1201 19:46:49.920336   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rread tag 0 count 0
I1201 19:46:49.920487   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 1
I1201 19:46:49.920526   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:49.921837   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Twalk tag 0 fid 0 newfid 1 0:'pod-dates' 
I1201 19:46:49.921912   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rerror tag 0 ename 'file not found' ecode 0
I1201 19:46:50.215267   69977 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:44880 Tclunk tag 0 fid 0
I1201 19:46:50.215321   69977 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:44880 Rclunk tag 0
I1201 19:46:50.216384   69977 main.go:127] stdlog: ufs.go:147 disconnected
I1201 19:46:50.237846   69977 out.go:179] * Unmounting /mount-9p ...
I1201 19:46:50.240888   69977 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1201 19:46:50.248105   69977 mount.go:180] unmount for /mount-9p ran successfully
I1201 19:46:50.248225   69977 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/.mount-process: {Name:mk9279aa892caab166b91e371f8617094275b2cd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1201 19:46:50.251356   69977 out.go:203] 
W1201 19:46:50.254279   69977 out.go:285] X Exiting due to MK_INTERRUPTED: Received terminated signal
X Exiting due to MK_INTERRUPTED: Received terminated signal
I1201 19:46:50.257142   69977 out.go:203] 
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port (2.23s)

TestKubernetesUpgrade (794.27s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-846544 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-arm64 start -p kubernetes-upgrade-846544 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (37.517750217s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-arm64 stop -p kubernetes-upgrade-846544
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-arm64 stop -p kubernetes-upgrade-846544: (1.356379928s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-846544 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-arm64 -p kubernetes-upgrade-846544 status --format={{.Host}}: exit status 7 (71.164709ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-846544 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p kubernetes-upgrade-846544 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: exit status 109 (12m30.704954797s)

-- stdout --
	* [kubernetes-upgrade-846544] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "kubernetes-upgrade-846544" primary control-plane node in "kubernetes-upgrade-846544" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	
	

-- /stdout --
** stderr ** 
	I1201 20:17:43.147962  199924 out.go:360] Setting OutFile to fd 1 ...
	I1201 20:17:43.148082  199924 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:17:43.148092  199924 out.go:374] Setting ErrFile to fd 2...
	I1201 20:17:43.148098  199924 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:17:43.148451  199924 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 20:17:43.148866  199924 out.go:368] Setting JSON to false
	I1201 20:17:43.149735  199924 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":7215,"bootTime":1764613049,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 20:17:43.149819  199924 start.go:143] virtualization:  
	I1201 20:17:43.152954  199924 out.go:179] * [kubernetes-upgrade-846544] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 20:17:43.156821  199924 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 20:17:43.156967  199924 notify.go:221] Checking for updates...
	I1201 20:17:43.162681  199924 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 20:17:43.165636  199924 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 20:17:43.168688  199924 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 20:17:43.171709  199924 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 20:17:43.174594  199924 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 20:17:43.177931  199924 config.go:182] Loaded profile config "kubernetes-upgrade-846544": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.28.0
	I1201 20:17:43.178570  199924 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 20:17:43.211070  199924 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 20:17:43.211175  199924 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 20:17:43.274361  199924 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:0 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:24 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-01 20:17:43.265059137 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 20:17:43.274466  199924 docker.go:319] overlay module found
	I1201 20:17:43.277626  199924 out.go:179] * Using the docker driver based on existing profile
	I1201 20:17:43.280493  199924 start.go:309] selected driver: docker
	I1201 20:17:43.280533  199924 start.go:927] validating driver "docker" against &{Name:kubernetes-upgrade-846544 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:kubernetes-upgrade-846544 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 20:17:43.280625  199924 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 20:17:43.281331  199924 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 20:17:43.343831  199924 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:0 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:24 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-01 20:17:43.334407068 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 20:17:43.344173  199924 cni.go:84] Creating CNI manager for ""
	I1201 20:17:43.344241  199924 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 20:17:43.344286  199924 start.go:353] cluster config:
	{Name:kubernetes-upgrade-846544 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-846544 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 20:17:43.347530  199924 out.go:179] * Starting "kubernetes-upgrade-846544" primary control-plane node in "kubernetes-upgrade-846544" cluster
	I1201 20:17:43.350480  199924 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 20:17:43.353627  199924 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 20:17:43.356498  199924 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 20:17:43.356584  199924 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 20:17:43.378598  199924 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 20:17:43.378621  199924 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1201 20:17:43.408880  199924 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1201 20:17:43.573611  199924 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1201 20:17:43.573748  199924 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/config.json ...
	I1201 20:17:43.573916  199924 cache.go:107] acquiring lock: {Name:mk5a09122d02521ef34c52b7e36a585d52fd9f21 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574001  199924 cache.go:243] Successfully downloaded all kic artifacts
	I1201 20:17:43.574008  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1201 20:17:43.574018  199924 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 122.905µs
	I1201 20:17:43.574033  199924 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1201 20:17:43.574029  199924 start.go:360] acquireMachinesLock for kubernetes-upgrade-846544: {Name:mk9ea7a890a2c4ec35e5d0c1d60666b6878e2e39 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574044  199924 cache.go:107] acquiring lock: {Name:mka1ee62c1593bc03c858a35d26f9e2c2b690f2f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574073  199924 start.go:364] duration metric: took 30.679µs to acquireMachinesLock for "kubernetes-upgrade-846544"
	I1201 20:17:43.574078  199924 cache.go:107] acquiring lock: {Name:mk1428d1cd9ff7a81b7e1db938ad2d6c63d6f0a0 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574091  199924 start.go:96] Skipping create...Using existing machine configuration
	I1201 20:17:43.574100  199924 fix.go:54] fixHost starting: 
	I1201 20:17:43.574124  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1201 20:17:43.574131  199924 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 55.385µs
	I1201 20:17:43.574138  199924 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1201 20:17:43.574160  199924 cache.go:107] acquiring lock: {Name:mkad6a428cd1ee354a7cfbf702340281bc69cb07 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574191  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1201 20:17:43.574196  199924 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 37.867µs
	I1201 20:17:43.574202  199924 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1201 20:17:43.574215  199924 cache.go:107] acquiring lock: {Name:mkb18b26fcf663181a7a99db9cd5b6336ab8823e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574242  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1201 20:17:43.574247  199924 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 33.419µs
	I1201 20:17:43.574253  199924 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1201 20:17:43.574262  199924 cache.go:107] acquiring lock: {Name:mk38216f1b78a5797f7bdaa5dbcc21785e81c0b1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574289  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1201 20:17:43.574293  199924 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 32.14µs
	I1201 20:17:43.574299  199924 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1201 20:17:43.574307  199924 cache.go:107] acquiring lock: {Name:mk0ada5a1c55cc71f4e4c3bcb210275f4c579244 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574331  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1201 20:17:43.574337  199924 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 29.654µs
	I1201 20:17:43.574343  199924 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1201 20:17:43.574351  199924 cache.go:107] acquiring lock: {Name:mk5372b924cb355fb27744bf15e6650674c1123b Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:17:43.574376  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1201 20:17:43.574381  199924 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 31.401µs
	I1201 20:17:43.574394  199924 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1201 20:17:43.574376  199924 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-846544 --format={{.State.Status}}
	I1201 20:17:43.574396  199924 cache.go:115] /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1201 20:17:43.574534  199924 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 481.055µs
	I1201 20:17:43.574553  199924 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1201 20:17:43.574562  199924 cache.go:87] Successfully saved all images to host disk.
	I1201 20:17:43.591125  199924 fix.go:112] recreateIfNeeded on kubernetes-upgrade-846544: state=Stopped err=<nil>
	W1201 20:17:43.591168  199924 fix.go:138] unexpected machine state, will restart: <nil>
	I1201 20:17:43.594586  199924 out.go:252] * Restarting existing docker container for "kubernetes-upgrade-846544" ...
	I1201 20:17:43.594682  199924 cli_runner.go:164] Run: docker start kubernetes-upgrade-846544
	I1201 20:17:43.910084  199924 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-846544 --format={{.State.Status}}
	I1201 20:17:43.929621  199924 kic.go:430] container "kubernetes-upgrade-846544" state is running.
	I1201 20:17:43.930018  199924 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-846544
	I1201 20:17:43.953382  199924 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/config.json ...
	I1201 20:17:43.953834  199924 machine.go:94] provisionDockerMachine start ...
	I1201 20:17:43.953981  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:43.977670  199924 main.go:143] libmachine: Using SSH client type: native
	I1201 20:17:43.978504  199924 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33008 <nil> <nil>}
	I1201 20:17:43.978535  199924 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 20:17:43.979287  199924 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:43966->127.0.0.1:33008: read: connection reset by peer
	I1201 20:17:47.149287  199924 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-846544
	
	I1201 20:17:47.149360  199924 ubuntu.go:182] provisioning hostname "kubernetes-upgrade-846544"
	I1201 20:17:47.149458  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:47.170959  199924 main.go:143] libmachine: Using SSH client type: native
	I1201 20:17:47.171259  199924 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33008 <nil> <nil>}
	I1201 20:17:47.171270  199924 main.go:143] libmachine: About to run SSH command:
	sudo hostname kubernetes-upgrade-846544 && echo "kubernetes-upgrade-846544" | sudo tee /etc/hostname
	I1201 20:17:47.343858  199924 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-846544
	
	I1201 20:17:47.343936  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:47.365048  199924 main.go:143] libmachine: Using SSH client type: native
	I1201 20:17:47.365357  199924 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33008 <nil> <nil>}
	I1201 20:17:47.365373  199924 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\skubernetes-upgrade-846544' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 kubernetes-upgrade-846544/g' /etc/hosts;
				else 
					echo '127.0.1.1 kubernetes-upgrade-846544' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 20:17:47.529973  199924 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 20:17:47.530062  199924 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 20:17:47.530111  199924 ubuntu.go:190] setting up certificates
	I1201 20:17:47.530140  199924 provision.go:84] configureAuth start
	I1201 20:17:47.530240  199924 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-846544
	I1201 20:17:47.563060  199924 provision.go:143] copyHostCerts
	I1201 20:17:47.563142  199924 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 20:17:47.563159  199924 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 20:17:47.563238  199924 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 20:17:47.563330  199924 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 20:17:47.563346  199924 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 20:17:47.563374  199924 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 20:17:47.563427  199924 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 20:17:47.563437  199924 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 20:17:47.563463  199924 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 20:17:47.563513  199924 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.kubernetes-upgrade-846544 san=[127.0.0.1 192.168.76.2 kubernetes-upgrade-846544 localhost minikube]
	I1201 20:17:47.667640  199924 provision.go:177] copyRemoteCerts
	I1201 20:17:47.667794  199924 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 20:17:47.667856  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:47.694535  199924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33008 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/kubernetes-upgrade-846544/id_rsa Username:docker}
	I1201 20:17:47.805587  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1241 bytes)
	I1201 20:17:47.836771  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1201 20:17:47.859867  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 20:17:47.881146  199924 provision.go:87] duration metric: took 350.962499ms to configureAuth
	I1201 20:17:47.881227  199924 ubuntu.go:206] setting minikube options for container-runtime
	I1201 20:17:47.881447  199924 config.go:182] Loaded profile config "kubernetes-upgrade-846544": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 20:17:47.881461  199924 machine.go:97] duration metric: took 3.927571303s to provisionDockerMachine
	I1201 20:17:47.881469  199924 start.go:293] postStartSetup for "kubernetes-upgrade-846544" (driver="docker")
	I1201 20:17:47.881482  199924 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 20:17:47.881551  199924 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 20:17:47.881599  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:47.901086  199924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33008 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/kubernetes-upgrade-846544/id_rsa Username:docker}
	I1201 20:17:48.006439  199924 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 20:17:48.014375  199924 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 20:17:48.014401  199924 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 20:17:48.014414  199924 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 20:17:48.014482  199924 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 20:17:48.014579  199924 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 20:17:48.014686  199924 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1201 20:17:48.025098  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 20:17:48.047522  199924 start.go:296] duration metric: took 166.037389ms for postStartSetup
	I1201 20:17:48.047687  199924 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 20:17:48.047760  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:48.067158  199924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33008 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/kubernetes-upgrade-846544/id_rsa Username:docker}
	I1201 20:17:48.170611  199924 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 20:17:48.176946  199924 fix.go:56] duration metric: took 4.602841581s for fixHost
	I1201 20:17:48.176972  199924 start.go:83] releasing machines lock for "kubernetes-upgrade-846544", held for 4.602891214s
	I1201 20:17:48.177042  199924 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-846544
	I1201 20:17:48.195521  199924 ssh_runner.go:195] Run: cat /version.json
	I1201 20:17:48.195544  199924 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 20:17:48.195572  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:48.195610  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:48.223032  199924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33008 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/kubernetes-upgrade-846544/id_rsa Username:docker}
	I1201 20:17:48.246919  199924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33008 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/kubernetes-upgrade-846544/id_rsa Username:docker}
	I1201 20:17:48.426032  199924 ssh_runner.go:195] Run: systemctl --version
	I1201 20:17:48.432781  199924 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1201 20:17:48.437894  199924 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 20:17:48.437958  199924 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 20:17:48.446740  199924 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1201 20:17:48.446764  199924 start.go:496] detecting cgroup driver to use...
	I1201 20:17:48.446816  199924 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 20:17:48.446868  199924 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 20:17:48.465329  199924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 20:17:48.480235  199924 docker.go:218] disabling cri-docker service (if available) ...
	I1201 20:17:48.480311  199924 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 20:17:48.495720  199924 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 20:17:48.508442  199924 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 20:17:48.626979  199924 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 20:17:48.754182  199924 docker.go:234] disabling docker service ...
	I1201 20:17:48.754299  199924 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 20:17:48.770446  199924 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 20:17:48.783767  199924 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 20:17:48.895416  199924 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 20:17:49.021402  199924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 20:17:49.036005  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 20:17:49.056611  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 20:17:49.069366  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 20:17:49.082194  199924 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 20:17:49.082275  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 20:17:49.096187  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 20:17:49.110584  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 20:17:49.122007  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 20:17:49.131854  199924 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 20:17:49.144225  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 20:17:49.155470  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 20:17:49.165864  199924 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 20:17:49.179312  199924 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 20:17:49.189108  199924 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 20:17:49.201260  199924 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 20:17:49.380227  199924 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 20:17:49.573921  199924 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 20:17:49.573987  199924 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 20:17:49.578964  199924 start.go:564] Will wait 60s for crictl version
	I1201 20:17:49.579021  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:49.585583  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 20:17:49.636599  199924 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 20:17:49.636680  199924 ssh_runner.go:195] Run: containerd --version
	I1201 20:17:49.676969  199924 ssh_runner.go:195] Run: containerd --version
	I1201 20:17:49.723025  199924 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1201 20:17:49.726196  199924 cli_runner.go:164] Run: docker network inspect kubernetes-upgrade-846544 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 20:17:49.761826  199924 ssh_runner.go:195] Run: grep 192.168.76.1	host.minikube.internal$ /etc/hosts
	I1201 20:17:49.765761  199924 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.76.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1201 20:17:49.788995  199924 kubeadm.go:884] updating cluster {Name:kubernetes-upgrade-846544 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-846544 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 20:17:49.789095  199924 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1201 20:17:49.789148  199924 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 20:17:49.834237  199924 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-beta.0". assuming images are not preloaded.
	I1201 20:17:49.834264  199924 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-beta.0 registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 registry.k8s.io/kube-scheduler:v1.35.0-beta.0 registry.k8s.io/kube-proxy:v1.35.0-beta.0 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.6.5-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1201 20:17:49.834336  199924 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 20:17:49.834564  199924 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 20:17:49.834730  199924 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 20:17:49.834912  199924 image.go:138] retrieving image: registry.k8s.io/etcd:3.6.5-0
	I1201 20:17:49.835022  199924 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 20:17:49.835118  199924 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1201 20:17:49.835594  199924 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1201 20:17:49.835854  199924 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 20:17:49.836835  199924 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 20:17:49.837124  199924 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 20:17:49.837299  199924 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 20:17:49.837460  199924 image.go:181] daemon lookup for registry.k8s.io/etcd:3.6.5-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.6.5-0
	I1201 20:17:49.837651  199924 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1201 20:17:49.837790  199924 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 20:17:49.838283  199924 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1201 20:17:49.841959  199924 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 20:17:50.194131  199924 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.6.5-0" and sha "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42"
	I1201 20:17:50.194214  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.6.5-0
	I1201 20:17:50.248130  199924 cache_images.go:118] "registry.k8s.io/etcd:3.6.5-0" needs transfer: "registry.k8s.io/etcd:3.6.5-0" does not exist at hash "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42" in container runtime
	I1201 20:17:50.248167  199924 cri.go:218] Removing image: registry.k8s.io/etcd:3.6.5-0
	I1201 20:17:50.248213  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:50.257064  199924 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1201 20:17:50.257127  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1201 20:17:50.265839  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1201 20:17:50.276095  199924 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" and sha "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4"
	I1201 20:17:50.276597  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 20:17:50.276361  199924 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" and sha "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b"
	I1201 20:17:50.276871  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 20:17:50.284299  199924 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1201 20:17:50.284374  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1201 20:17:50.292474  199924 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" and sha "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be"
	I1201 20:17:50.292543  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 20:17:50.294542  199924 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-beta.0" and sha "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904"
	I1201 20:17:50.294620  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 20:17:50.392277  199924 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1201 20:17:50.392334  199924 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1201 20:17:50.392388  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:50.435397  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1201 20:17:50.490069  199924 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" does not exist at hash "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b" in container runtime
	I1201 20:17:50.490110  199924 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 20:17:50.490167  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:50.490210  199924 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" does not exist at hash "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4" in container runtime
	I1201 20:17:50.490226  199924 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 20:17:50.490245  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:50.490305  199924 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1201 20:17:50.490318  199924 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1201 20:17:50.490337  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:50.490380  199924 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" does not exist at hash "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be" in container runtime
	I1201 20:17:50.490392  199924 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 20:17:50.490410  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:50.490459  199924 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-beta.0" does not exist at hash "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904" in container runtime
	I1201 20:17:50.490471  199924 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 20:17:50.490491  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:50.490545  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1201 20:17:50.534964  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1201 20:17:50.577312  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1201 20:17:50.577415  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 20:17:50.577525  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 20:17:50.577609  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1201 20:17:50.577681  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 20:17:50.577762  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 20:17:50.635097  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0
	I1201 20:17:50.635250  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0
	I1201 20:17:50.743958  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 20:17:50.744032  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1201 20:17:50.744087  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 20:17:50.744139  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 20:17:50.744192  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1201 20:17:50.744267  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 20:17:50.744315  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/etcd_3.6.5-0: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/etcd_3.6.5-0': No such file or directory
	I1201 20:17:50.744333  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 --> /var/lib/minikube/images/etcd_3.6.5-0 (21148160 bytes)
	I1201 20:17:50.928165  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1201 20:17:50.928318  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1
	I1201 20:17:50.928331  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1201 20:17:50.928372  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1201 20:17:50.928428  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1201 20:17:50.928432  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1201 20:17:50.928467  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1201 20:17:51.091575  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0
	I1201 20:17:51.091697  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1201 20:17:51.091811  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/coredns_v1.13.1: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/coredns_v1.13.1': No such file or directory
	I1201 20:17:51.091833  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 --> /var/lib/minikube/images/coredns_v1.13.1 (21178368 bytes)
	I1201 20:17:51.091907  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0
	I1201 20:17:51.091987  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1201 20:17:51.092067  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0
	I1201 20:17:51.092147  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1201 20:17:51.092226  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1201 20:17:51.092293  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1201 20:17:51.092368  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0
	I1201 20:17:51.092442  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	W1201 20:17:51.142505  199924 image.go:286] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1201 20:17:51.142783  199924 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1201 20:17:51.142870  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 20:17:51.155306  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-proxy_v1.35.0-beta.0': No such file or directory
	I1201 20:17:51.155350  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0 (22432256 bytes)
	I1201 20:17:51.155591  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0': No such file or directory
	I1201 20:17:51.155613  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0 (24689152 bytes)
	I1201 20:17:51.155687  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0': No such file or directory
	I1201 20:17:51.155716  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0 (15401984 bytes)
	I1201 20:17:51.155784  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0': No such file or directory
	I1201 20:17:51.155798  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0 (20672000 bytes)
	I1201 20:17:51.155900  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1201 20:17:51.155917  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	W1201 20:17:51.169714  199924 ssh_runner.go:129] session error, resetting client: ssh: rejected: connect failed (open failed)
	I1201 20:17:51.169754  199924 retry.go:31] will retry after 133.34414ms: ssh: rejected: connect failed (open failed)
	W1201 20:17:51.169793  199924 ssh_runner.go:129] session error, resetting client: ssh: rejected: connect failed (open failed)
	I1201 20:17:51.170028  199924 retry.go:31] will retry after 312.295271ms: ssh: rejected: connect failed (open failed)
	I1201 20:17:51.228538  199924 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1201 20:17:51.228640  199924 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 20:17:51.228714  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:17:51.228820  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:51.262430  199924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33008 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/kubernetes-upgrade-846544/id_rsa Username:docker}
	I1201 20:17:51.303456  199924 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-846544
	I1201 20:17:51.391456  199924 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33008 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/kubernetes-upgrade-846544/id_rsa Username:docker}
	I1201 20:17:51.479907  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1201 20:17:51.628054  199924 containerd.go:285] Loading image: /var/lib/minikube/images/etcd_3.6.5-0
	I1201 20:17:51.628172  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0
	I1201 20:17:51.638206  199924 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1201 20:17:51.638351  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1201 20:17:53.203675  199924 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0: (1.575457789s)
	I1201 20:17:53.203791  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 from cache
	I1201 20:17:53.203762  199924 ssh_runner.go:235] Completed: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: (1.565376844s)
	I1201 20:17:53.203841  199924 containerd.go:285] Loading image: /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1201 20:17:53.203928  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1201 20:17:53.203873  199924 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1201 20:17:53.204037  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1201 20:17:54.624014  199924 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: (1.420032962s)
	I1201 20:17:54.624052  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 from cache
	I1201 20:17:54.624077  199924 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1201 20:17:54.624124  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	I1201 20:17:54.773081  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 from cache
	I1201 20:17:54.773118  199924 containerd.go:285] Loading image: /var/lib/minikube/images/coredns_v1.13.1
	I1201 20:17:54.773174  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1
	I1201 20:17:56.128479  199924 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1: (1.355276789s)
	I1201 20:17:56.128507  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 from cache
	I1201 20:17:56.128524  199924 containerd.go:285] Loading image: /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1201 20:17:56.128569  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1201 20:17:57.570336  199924 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: (1.441739017s)
	I1201 20:17:57.570359  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 from cache
	I1201 20:17:57.570378  199924 containerd.go:285] Loading image: /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1201 20:17:57.570435  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1201 20:17:59.361286  199924 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: (1.790824724s)
	I1201 20:17:59.361312  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 from cache
	I1201 20:17:59.361337  199924 containerd.go:285] Loading image: /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1201 20:17:59.361385  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1201 20:18:00.931865  199924 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: (1.570457809s)
	I1201 20:18:00.931895  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 from cache
	I1201 20:18:00.931914  199924 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1201 20:18:00.931960  199924 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1201 20:18:01.453558  199924 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/21997-2497/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1201 20:18:01.453595  199924 cache_images.go:125] Successfully loaded all cached images
	I1201 20:18:01.453601  199924 cache_images.go:94] duration metric: took 11.619323972s to LoadCachedImages
	I1201 20:18:01.453613  199924 kubeadm.go:935] updating node { 192.168.76.2 8443 v1.35.0-beta.0 containerd true true} ...
	I1201 20:18:01.453730  199924 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=kubernetes-upgrade-846544 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.76.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-846544 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 20:18:01.453803  199924 ssh_runner.go:195] Run: sudo crictl info
	I1201 20:18:01.482172  199924 cni.go:84] Creating CNI manager for ""
	I1201 20:18:01.482195  199924 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 20:18:01.482216  199924 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 20:18:01.482240  199924 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.76.2 APIServerPort:8443 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:kubernetes-upgrade-846544 NodeName:kubernetes-upgrade-846544 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.76.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.76.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 20:18:01.482373  199924 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.76.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "kubernetes-upgrade-846544"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.76.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
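	The kubeadm config rendered above is a four-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration, separated by `---`). A minimal sketch, assuming only POSIX `grep`, that lists the `kind` of each document in such a stream; the here-doc below is an abbreviated stand-in for the full config, not a copy of it:

```shell
#!/bin/sh
# Abbreviated stand-in for /var/tmp/minikube/kubeadm.yaml: a multi-document
# YAML stream separated by '---' lines, one 'kind:' per document.
cat > /tmp/kubeadm-sample.yaml <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF

# List the kind of every document in the stream.
grep '^kind:' /tmp/kubeadm-sample.yaml
```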
	I1201 20:18:01.482442  199924 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 20:18:01.499382  199924 binaries.go:54] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.35.0-beta.0': No such file or directory
	
	Initiating transfer...
	I1201 20:18:01.499523  199924 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.35.0-beta.0
	I1201 20:18:01.519895  199924 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl.sha256
	I1201 20:18:01.520007  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl
	I1201 20:18:01.520083  199924 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet.sha256
	I1201 20:18:01.520115  199924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 20:18:01.520191  199924 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1201 20:18:01.520241  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm
	I1201 20:18:01.560303  199924 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm': No such file or directory
	I1201 20:18:01.560337  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubeadm --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm (68354232 bytes)
	I1201 20:18:01.560396  199924 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubectl': No such file or directory
	I1201 20:18:01.560404  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubectl --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl (55181496 bytes)
	I1201 20:18:01.560504  199924 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet
	I1201 20:18:01.619325  199924 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet': No such file or directory
	I1201 20:18:01.619407  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubelet --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet (54329636 bytes)
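	The `binary.go:80` lines above pair each download URL with a detached `.sha256` checksum file (`?checksum=file:...sha256`). The verify-after-download step can be reproduced with `sha256sum -c`; a sketch using a throwaway file in place of the real kubelet binary (all paths illustrative):

```shell
#!/bin/sh
set -e
# Stand-in for a downloaded binary and its detached checksum file.
printf 'fake-kubelet-bytes' > /tmp/kubelet
# Emit the "<digest>  <name>" form that `sha256sum -c` expects.
sha256sum /tmp/kubelet > /tmp/kubelet.sha256

# Verification step: exits non-zero if the digest does not match the file.
( cd /tmp && sha256sum -c kubelet.sha256 )
```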
	I1201 20:18:02.553709  199924 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 20:18:02.563346  199924 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (336 bytes)
	I1201 20:18:02.587189  199924 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1201 20:18:02.604972  199924 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2245 bytes)
	I1201 20:18:02.621255  199924 ssh_runner.go:195] Run: grep 192.168.76.2	control-plane.minikube.internal$ /etc/hosts
	I1201 20:18:02.628616  199924 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.76.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
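	The `/etc/hosts` rewrite above is an idempotent upsert: strip any existing `control-plane.minikube.internal` line, append the fresh mapping, write to a temp file, then copy it into place so the target is never truncated mid-rewrite. The same pattern against a scratch file (no sudo, illustrative addresses):

```shell
#!/bin/sh
set -e
HOSTS=/tmp/hosts.demo
TAB=$(printf '\t')
# Seed the file with a stale control-plane mapping.
printf '127.0.0.1\tlocalhost\n192.168.76.3\tcontrol-plane.minikube.internal\n' > "$HOSTS"

# Drop the stale mapping, append the current one, then copy into place.
{ grep -v "${TAB}control-plane.minikube.internal\$" "$HOSTS"
  printf '192.168.76.2\tcontrol-plane.minikube.internal\n'; } > "$HOSTS.new"
cp "$HOSTS.new" "$HOSTS"

grep 'control-plane.minikube.internal' "$HOSTS"
```

	Running the block twice leaves the file unchanged, which is why minikube can apply it unconditionally on every start.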
	I1201 20:18:02.642208  199924 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 20:18:02.818984  199924 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 20:18:02.845022  199924 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544 for IP: 192.168.76.2
	I1201 20:18:02.845045  199924 certs.go:195] generating shared ca certs ...
	I1201 20:18:02.845061  199924 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:18:02.845203  199924 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 20:18:02.845252  199924 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 20:18:02.845264  199924 certs.go:257] generating profile certs ...
	I1201 20:18:02.845359  199924 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/client.key
	I1201 20:18:02.845448  199924 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/apiserver.key.135eaaca
	I1201 20:18:02.845525  199924 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/proxy-client.key
	I1201 20:18:02.845642  199924 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 20:18:02.845680  199924 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 20:18:02.845694  199924 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 20:18:02.845724  199924 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 20:18:02.845757  199924 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 20:18:02.845785  199924 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 20:18:02.845834  199924 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 20:18:02.846420  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 20:18:02.869402  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 20:18:02.891729  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 20:18:02.915131  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 20:18:02.935829  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I1201 20:18:02.955759  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1201 20:18:02.975802  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 20:18:02.995783  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1201 20:18:03.020232  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 20:18:03.042375  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 20:18:03.063293  199924 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 20:18:03.081551  199924 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 20:18:03.095185  199924 ssh_runner.go:195] Run: openssl version
	I1201 20:18:03.101889  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 20:18:03.111028  199924 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 20:18:03.115535  199924 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 20:18:03.115622  199924 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 20:18:03.157609  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 20:18:03.165909  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 20:18:03.174503  199924 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 20:18:03.178665  199924 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 20:18:03.178750  199924 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 20:18:03.219901  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 20:18:03.228160  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 20:18:03.236657  199924 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 20:18:03.240564  199924 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 20:18:03.240680  199924 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 20:18:03.283482  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 20:18:03.292134  199924 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 20:18:03.296091  199924 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1201 20:18:03.337701  199924 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1201 20:18:03.380228  199924 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1201 20:18:03.422202  199924 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1201 20:18:03.464720  199924 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1201 20:18:03.506527  199924 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1201 20:18:03.552588  199924 kubeadm.go:401] StartCluster: {Name:kubernetes-upgrade-846544 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-846544 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 20:18:03.552717  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 20:18:03.552795  199924 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 20:18:03.579600  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:18:03.579679  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:18:03.579698  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:18:03.579711  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:18:03.579715  199924 cri.go:89] found id: ""
	I1201 20:18:03.579780  199924 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	W1201 20:18:03.605054  199924 kubeadm.go:408] unpause failed: list paused: runc: sudo runc --root /run/containerd/runc/k8s.io list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-01T20:18:03Z" level=error msg="open /run/containerd/runc/k8s.io: no such file or directory"
	I1201 20:18:03.605127  199924 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 20:18:03.614106  199924 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1201 20:18:03.614125  199924 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1201 20:18:03.614194  199924 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1201 20:18:03.623466  199924 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1201 20:18:03.623998  199924 kubeconfig.go:47] verify endpoint returned: get endpoint: "kubernetes-upgrade-846544" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 20:18:03.624241  199924 kubeconfig.go:62] /home/jenkins/minikube-integration/21997-2497/kubeconfig needs updating (will repair): [kubeconfig missing "kubernetes-upgrade-846544" cluster setting kubeconfig missing "kubernetes-upgrade-846544" context setting]
	I1201 20:18:03.624694  199924 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:18:03.625342  199924 kapi.go:59] client config for kubernetes-upgrade-846544: &rest.Config{Host:"https://192.168.76.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/client.crt", KeyFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kubernetes-upgrade-846544/client.key", CAFile:"/home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1201 20:18:03.626030  199924 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1201 20:18:03.626050  199924 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1201 20:18:03.626058  199924 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1201 20:18:03.626064  199924 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1201 20:18:03.626068  199924 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1201 20:18:03.626343  199924 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1201 20:18:03.637998  199924 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-01 20:17:20.388352516 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-01 20:18:02.616701417 +0000
	@@ -1,4 +1,4 @@
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: InitConfiguration
	 localAPIEndpoint:
	   advertiseAddress: 192.168.76.2
	@@ -14,31 +14,34 @@
	   criSocket: unix:///run/containerd/containerd.sock
	   name: "kubernetes-upgrade-846544"
	   kubeletExtraArgs:
	-    node-ip: 192.168.76.2
	+    - name: "node-ip"
	+      value: "192.168.76.2"
	   taints: []
	 ---
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: ClusterConfiguration
	 apiServer:
	   certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	   extraArgs:
	-    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+    - name: "enable-admission-plugins"
	+      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	 controllerManager:
	   extraArgs:
	-    allocate-node-cidrs: "true"
	-    leader-elect: "false"
	+    - name: "allocate-node-cidrs"
	+      value: "true"
	+    - name: "leader-elect"
	+      value: "false"
	 scheduler:
	   extraArgs:
	-    leader-elect: "false"
	+    - name: "leader-elect"
	+      value: "false"
	 certificatesDir: /var/lib/minikube/certs
	 clusterName: mk
	 controlPlaneEndpoint: control-plane.minikube.internal:8443
	 etcd:
	   local:
	     dataDir: /var/lib/minikube/etcd
	-    extraArgs:
	-      proxy-refresh-interval: "70000"
	-kubernetesVersion: v1.28.0
	+kubernetesVersion: v1.35.0-beta.0
	 networking:
	   dnsDomain: cluster.local
	   podSubnet: "10.244.0.0/16"
	
	-- /stdout --
	I1201 20:18:03.638020  199924 kubeadm.go:1161] stopping kube-system containers ...
	I1201 20:18:03.638033  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1201 20:18:03.638096  199924 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 20:18:03.674483  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:18:03.674547  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:18:03.674570  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:18:03.674582  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:18:03.674586  199924 cri.go:89] found id: ""
	I1201 20:18:03.674592  199924 cri.go:252] Stopping containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:18:03.674647  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:18:03.679393  199924 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl stop --timeout=10 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26
	I1201 20:18:03.718963  199924 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1201 20:18:03.734478  199924 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 20:18:03.742712  199924 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5639 Dec  1 20:17 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5656 Dec  1 20:17 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Dec  1 20:17 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5600 Dec  1 20:17 /etc/kubernetes/scheduler.conf
	
	I1201 20:18:03.742817  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1201 20:18:03.750754  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1201 20:18:03.760583  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1201 20:18:03.769461  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 20:18:03.769562  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 20:18:03.778336  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1201 20:18:03.786166  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1201 20:18:03.786228  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 20:18:03.793876  199924 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 20:18:03.801955  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 20:18:03.845392  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 20:18:05.107426  199924 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.261939579s)
	I1201 20:18:05.107523  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1201 20:18:05.316978  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1201 20:18:05.384370  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1201 20:18:05.429416  199924 api_server.go:52] waiting for apiserver process to appear ...
	I1201 20:18:05.429590  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:05.929662  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:06.429801  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:06.929731  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:07.429719  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:07.929867  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:08.430608  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:08.930539  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:09.429757  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:09.929634  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:10.429819  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:10.929900  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:11.429719  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:11.929729  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:12.430061  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:12.930575  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:13.429789  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:13.930131  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:14.429698  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:14.930678  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:15.430429  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:15.930610  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:16.429810  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:16.930472  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:17.429702  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:17.930520  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:18.429746  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:18.930648  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:19.430695  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:19.929932  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:20.430352  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:20.930419  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:21.430393  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:21.929713  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:22.429961  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:22.929785  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:23.429750  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:23.930646  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:24.429764  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:24.930339  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:25.429766  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:25.929937  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:26.429738  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:26.929940  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:27.429747  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:27.929677  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:28.430134  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:28.930396  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:29.429701  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:29.930318  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:30.429753  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:30.929715  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:31.430394  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:31.930222  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:32.430558  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:32.930437  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:33.430650  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:33.930336  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:34.429710  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:34.930133  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:35.430582  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:35.929749  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:36.430044  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:36.930425  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:37.429779  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:37.929849  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:38.430402  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:38.930682  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:39.429893  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:39.929726  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:40.430415  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:40.929741  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:41.429677  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:41.929682  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:42.430416  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:42.930706  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:43.430107  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:43.930364  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:44.429763  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:44.929939  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:45.429793  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:45.929745  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:46.429794  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:46.930472  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:47.429726  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:47.930423  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:48.430130  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:48.930555  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:49.430338  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:49.930402  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:50.429725  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:50.929751  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:51.429732  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:51.930615  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:52.429713  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:52.930236  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:53.430625  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:53.930203  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:54.430211  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:54.930262  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:55.429653  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:55.930698  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:56.429738  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:56.929648  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:57.429774  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:57.930408  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:58.429715  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:58.929733  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:59.430382  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:18:59.930685  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:00.429655  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:00.930131  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:01.429665  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:01.930686  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:02.429660  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:02.929723  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:03.430197  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:03.930148  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:04.430523  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:04.931181  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:05.429688  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:05.429772  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:05.463701  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:05.463721  199924 cri.go:89] found id: ""
	I1201 20:19:05.463729  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:05.463783  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:05.468214  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:05.468281  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:05.499237  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:05.499256  199924 cri.go:89] found id: ""
	I1201 20:19:05.499264  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:05.499319  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:05.503881  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:05.504000  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:05.539821  199924 cri.go:89] found id: ""
	I1201 20:19:05.539843  199924 logs.go:282] 0 containers: []
	W1201 20:19:05.539851  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:05.539858  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:05.539917  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:05.571983  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:05.572002  199924 cri.go:89] found id: ""
	I1201 20:19:05.572010  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:05.572063  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:05.576369  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:05.576435  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:05.614809  199924 cri.go:89] found id: ""
	I1201 20:19:05.614830  199924 logs.go:282] 0 containers: []
	W1201 20:19:05.614838  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:05.614844  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:05.614902  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:05.667510  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:05.667586  199924 cri.go:89] found id: ""
	I1201 20:19:05.667608  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:05.667690  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:05.677350  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:05.677507  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:05.731573  199924 cri.go:89] found id: ""
	I1201 20:19:05.731639  199924 logs.go:282] 0 containers: []
	W1201 20:19:05.731663  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:05.731681  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:05.731766  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:05.765907  199924 cri.go:89] found id: ""
	I1201 20:19:05.765981  199924 logs.go:282] 0 containers: []
	W1201 20:19:05.766004  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:05.766033  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:05.766068  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:05.832048  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:05.832126  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:05.852161  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:05.852232  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:05.910251  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:05.910280  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:06.001788  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:06.001808  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:06.001821  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:06.069069  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:06.069148  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:06.104689  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:06.104760  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:06.150349  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:06.150380  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:06.196637  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:06.196672  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:08.741305  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:08.752944  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:08.753016  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:08.780993  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:08.781014  199924 cri.go:89] found id: ""
	I1201 20:19:08.781022  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:08.781077  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:08.785046  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:08.785118  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:08.810985  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:08.811009  199924 cri.go:89] found id: ""
	I1201 20:19:08.811024  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:08.811079  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:08.815086  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:08.815160  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:08.852116  199924 cri.go:89] found id: ""
	I1201 20:19:08.852144  199924 logs.go:282] 0 containers: []
	W1201 20:19:08.852153  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:08.852160  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:08.852228  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:08.894480  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:08.894504  199924 cri.go:89] found id: ""
	I1201 20:19:08.894512  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:08.894567  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:08.899257  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:08.899334  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:08.932041  199924 cri.go:89] found id: ""
	I1201 20:19:08.932065  199924 logs.go:282] 0 containers: []
	W1201 20:19:08.932074  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:08.932081  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:08.932145  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:08.960213  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:08.960236  199924 cri.go:89] found id: ""
	I1201 20:19:08.960244  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:08.960300  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:08.964279  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:08.964351  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:08.990597  199924 cri.go:89] found id: ""
	I1201 20:19:08.990624  199924 logs.go:282] 0 containers: []
	W1201 20:19:08.990633  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:08.990640  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:08.990701  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:09.025760  199924 cri.go:89] found id: ""
	I1201 20:19:09.025785  199924 logs.go:282] 0 containers: []
	W1201 20:19:09.025794  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:09.025809  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:09.025821  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:09.084276  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:09.084312  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:09.160127  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:09.160149  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:09.160163  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:09.196776  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:09.196809  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:09.234054  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:09.234087  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:09.268151  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:09.268189  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:09.281341  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:09.281370  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:09.315578  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:09.315610  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:09.354048  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:09.354083  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:11.886710  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:11.898546  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:11.898618  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:11.929601  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:11.929620  199924 cri.go:89] found id: ""
	I1201 20:19:11.929628  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:11.929683  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:11.934291  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:11.934367  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:11.964888  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:11.964908  199924 cri.go:89] found id: ""
	I1201 20:19:11.964916  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:11.964969  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:11.968795  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:11.968861  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:12.000744  199924 cri.go:89] found id: ""
	I1201 20:19:12.000771  199924 logs.go:282] 0 containers: []
	W1201 20:19:12.000783  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:12.000792  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:12.000854  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:12.031351  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:12.031375  199924 cri.go:89] found id: ""
	I1201 20:19:12.031384  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:12.031438  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:12.035656  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:12.035734  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:12.071921  199924 cri.go:89] found id: ""
	I1201 20:19:12.071944  199924 logs.go:282] 0 containers: []
	W1201 20:19:12.071952  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:12.071959  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:12.072020  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:12.114874  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:12.114893  199924 cri.go:89] found id: ""
	I1201 20:19:12.114901  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:12.114946  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:12.133426  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:12.133563  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:12.184099  199924 cri.go:89] found id: ""
	I1201 20:19:12.184120  199924 logs.go:282] 0 containers: []
	W1201 20:19:12.184129  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:12.184135  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:12.184189  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:12.209152  199924 cri.go:89] found id: ""
	I1201 20:19:12.209173  199924 logs.go:282] 0 containers: []
	W1201 20:19:12.209181  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:12.209194  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:12.209206  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:12.222811  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:12.222841  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:12.310060  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:12.310079  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:12.310092  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:12.352061  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:12.352137  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:12.395269  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:12.395303  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:12.432062  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:12.432095  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:12.472164  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:12.472255  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:12.525091  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:12.525169  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:12.571641  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:12.571666  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:15.143829  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:15.165511  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:15.165590  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:15.194223  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:15.194244  199924 cri.go:89] found id: ""
	I1201 20:19:15.194252  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:15.194306  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:15.198607  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:15.198676  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:15.229756  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:15.229781  199924 cri.go:89] found id: ""
	I1201 20:19:15.229790  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:15.229852  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:15.234266  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:15.234336  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:15.266888  199924 cri.go:89] found id: ""
	I1201 20:19:15.266961  199924 logs.go:282] 0 containers: []
	W1201 20:19:15.266984  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:15.267007  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:15.267096  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:15.299673  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:15.299692  199924 cri.go:89] found id: ""
	I1201 20:19:15.299700  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:15.299760  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:15.304111  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:15.304190  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:15.338681  199924 cri.go:89] found id: ""
	I1201 20:19:15.338703  199924 logs.go:282] 0 containers: []
	W1201 20:19:15.338712  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:15.338718  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:15.338776  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:15.371194  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:15.371213  199924 cri.go:89] found id: ""
	I1201 20:19:15.371220  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:15.371273  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:15.375388  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:15.375502  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:15.402855  199924 cri.go:89] found id: ""
	I1201 20:19:15.402927  199924 logs.go:282] 0 containers: []
	W1201 20:19:15.402952  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:15.402971  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:15.403041  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:15.433299  199924 cri.go:89] found id: ""
	I1201 20:19:15.433374  199924 logs.go:282] 0 containers: []
	W1201 20:19:15.433397  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:15.433427  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:15.433471  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:15.504961  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:15.505049  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:15.547789  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:15.547860  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:15.598563  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:15.598635  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:15.648924  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:15.648999  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:15.668836  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:15.668911  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:15.763703  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:15.763916  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:15.763958  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:15.801194  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:15.801268  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:15.847481  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:15.847555  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:18.421204  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:18.433322  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:18.433401  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:18.464251  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:18.464271  199924 cri.go:89] found id: ""
	I1201 20:19:18.464279  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:18.464343  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:18.470710  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:18.470804  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:18.515832  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:18.515903  199924 cri.go:89] found id: ""
	I1201 20:19:18.515927  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:18.516010  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:18.520335  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:18.520403  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:18.549033  199924 cri.go:89] found id: ""
	I1201 20:19:18.549055  199924 logs.go:282] 0 containers: []
	W1201 20:19:18.549063  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:18.549070  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:18.549134  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:18.583235  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:18.583485  199924 cri.go:89] found id: ""
	I1201 20:19:18.583509  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:18.583568  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:18.601869  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:18.602017  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:18.634934  199924 cri.go:89] found id: ""
	I1201 20:19:18.635007  199924 logs.go:282] 0 containers: []
	W1201 20:19:18.635029  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:18.635050  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:18.635163  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:18.676974  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:18.676997  199924 cri.go:89] found id: ""
	I1201 20:19:18.677006  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:18.677067  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:18.681698  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:18.681772  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:18.723811  199924 cri.go:89] found id: ""
	I1201 20:19:18.723836  199924 logs.go:282] 0 containers: []
	W1201 20:19:18.723845  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:18.723851  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:18.723910  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:18.750765  199924 cri.go:89] found id: ""
	I1201 20:19:18.750790  199924 logs.go:282] 0 containers: []
	W1201 20:19:18.750799  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:18.750815  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:18.750828  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:18.785726  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:18.785798  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:18.838773  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:18.838850  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:18.918700  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:18.918778  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:18.987270  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:18.987303  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:19.028162  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:19.028193  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:19.096589  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:19.096623  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:19.110946  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:19.110974  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:19.191401  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:19.191425  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:19.191441  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:21.732949  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:21.744613  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:21.744682  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:21.782259  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:21.782282  199924 cri.go:89] found id: ""
	I1201 20:19:21.782290  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:21.782343  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:21.788933  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:21.789005  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:21.819798  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:21.819831  199924 cri.go:89] found id: ""
	I1201 20:19:21.819839  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:21.819894  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:21.824568  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:21.824643  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:21.875603  199924 cri.go:89] found id: ""
	I1201 20:19:21.875630  199924 logs.go:282] 0 containers: []
	W1201 20:19:21.875640  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:21.875651  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:21.875721  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:21.923918  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:21.923943  199924 cri.go:89] found id: ""
	I1201 20:19:21.923952  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:21.924007  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:21.928426  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:21.928508  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:21.975940  199924 cri.go:89] found id: ""
	I1201 20:19:21.975969  199924 logs.go:282] 0 containers: []
	W1201 20:19:21.975978  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:21.975984  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:21.976039  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:22.008826  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:22.008854  199924 cri.go:89] found id: ""
	I1201 20:19:22.008863  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:22.008925  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:22.014504  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:22.014589  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:22.055669  199924 cri.go:89] found id: ""
	I1201 20:19:22.055698  199924 logs.go:282] 0 containers: []
	W1201 20:19:22.055707  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:22.055714  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:22.055772  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:22.083976  199924 cri.go:89] found id: ""
	I1201 20:19:22.083999  199924 logs.go:282] 0 containers: []
	W1201 20:19:22.084007  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:22.084023  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:22.084035  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:22.167482  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:22.167500  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:22.167516  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:22.203443  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:22.203513  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:22.237653  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:22.237680  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:22.298499  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:22.298532  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:22.318516  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:22.318541  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:22.365212  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:22.365285  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:22.416514  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:22.416589  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:22.477975  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:22.478045  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:25.019997  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:25.032647  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:25.032715  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:25.060703  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:25.060723  199924 cri.go:89] found id: ""
	I1201 20:19:25.060731  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:25.060788  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:25.064587  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:25.064660  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:25.094217  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:25.094238  199924 cri.go:89] found id: ""
	I1201 20:19:25.094247  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:25.094310  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:25.098504  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:25.098585  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:25.125179  199924 cri.go:89] found id: ""
	I1201 20:19:25.125202  199924 logs.go:282] 0 containers: []
	W1201 20:19:25.125210  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:25.125217  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:25.125302  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:25.151351  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:25.151376  199924 cri.go:89] found id: ""
	I1201 20:19:25.151385  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:25.151446  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:25.155538  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:25.155627  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:25.184750  199924 cri.go:89] found id: ""
	I1201 20:19:25.184780  199924 logs.go:282] 0 containers: []
	W1201 20:19:25.184789  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:25.184796  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:25.184858  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:25.211599  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:25.211667  199924 cri.go:89] found id: ""
	I1201 20:19:25.211680  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:25.211748  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:25.215841  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:25.215943  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:25.241047  199924 cri.go:89] found id: ""
	I1201 20:19:25.241073  199924 logs.go:282] 0 containers: []
	W1201 20:19:25.241082  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:25.241088  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:25.241145  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:25.266577  199924 cri.go:89] found id: ""
	I1201 20:19:25.266652  199924 logs.go:282] 0 containers: []
	W1201 20:19:25.266675  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:25.266705  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:25.266740  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:25.282177  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:25.282206  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:25.317364  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:25.317397  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:25.353915  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:25.353945  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:25.422917  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:25.422949  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:25.477672  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:25.477710  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:25.554564  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:25.554648  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:25.704596  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:25.704615  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:25.704627  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:25.782340  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:25.782369  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:28.332010  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:28.349962  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:28.350034  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:28.410237  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:28.410256  199924 cri.go:89] found id: ""
	I1201 20:19:28.410264  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:28.410318  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:28.414191  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:28.414278  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:28.465167  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:28.465190  199924 cri.go:89] found id: ""
	I1201 20:19:28.465199  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:28.465260  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:28.469423  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:28.469542  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:28.525193  199924 cri.go:89] found id: ""
	I1201 20:19:28.525214  199924 logs.go:282] 0 containers: []
	W1201 20:19:28.525223  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:28.525229  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:28.525290  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:28.561129  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:28.561147  199924 cri.go:89] found id: ""
	I1201 20:19:28.561154  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:28.561214  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:28.565509  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:28.565569  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:28.621921  199924 cri.go:89] found id: ""
	I1201 20:19:28.621943  199924 logs.go:282] 0 containers: []
	W1201 20:19:28.621951  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:28.621958  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:28.622019  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:28.684151  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:28.684171  199924 cri.go:89] found id: ""
	I1201 20:19:28.684179  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:28.684233  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:28.696253  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:28.696331  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:28.748621  199924 cri.go:89] found id: ""
	I1201 20:19:28.748643  199924 logs.go:282] 0 containers: []
	W1201 20:19:28.748651  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:28.748658  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:28.748714  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:28.801999  199924 cri.go:89] found id: ""
	I1201 20:19:28.802022  199924 logs.go:282] 0 containers: []
	W1201 20:19:28.802031  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:28.802045  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:28.802056  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:28.894855  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:28.894893  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:28.914794  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:28.914822  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:29.041129  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:29.041149  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:29.041164  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:29.113930  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:29.113961  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:29.168441  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:29.168473  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:29.228231  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:29.228263  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:29.282025  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:29.282063  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:29.320980  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:29.321011  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:31.884222  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:31.897573  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:31.897642  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:31.924656  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:31.924676  199924 cri.go:89] found id: ""
	I1201 20:19:31.924684  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:31.924744  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:31.928626  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:31.928717  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:31.954930  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:31.954953  199924 cri.go:89] found id: ""
	I1201 20:19:31.954961  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:31.955017  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:31.958882  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:31.958959  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:31.987975  199924 cri.go:89] found id: ""
	I1201 20:19:31.988002  199924 logs.go:282] 0 containers: []
	W1201 20:19:31.988011  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:31.988018  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:31.988081  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:32.018596  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:32.018644  199924 cri.go:89] found id: ""
	I1201 20:19:32.018653  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:32.018715  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:32.022874  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:32.023006  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:32.066660  199924 cri.go:89] found id: ""
	I1201 20:19:32.066699  199924 logs.go:282] 0 containers: []
	W1201 20:19:32.066724  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:32.066736  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:32.066813  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:32.103464  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:32.103489  199924 cri.go:89] found id: ""
	I1201 20:19:32.103505  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:32.103594  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:32.109850  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:32.109948  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:32.153008  199924 cri.go:89] found id: ""
	I1201 20:19:32.153031  199924 logs.go:282] 0 containers: []
	W1201 20:19:32.153079  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:32.153086  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:32.153165  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:32.191026  199924 cri.go:89] found id: ""
	I1201 20:19:32.191063  199924 logs.go:282] 0 containers: []
	W1201 20:19:32.191072  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:32.191107  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:32.191139  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:32.251812  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:32.251845  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:32.286049  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:32.286083  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:32.334505  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:32.334535  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:32.398956  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:32.398989  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:32.485745  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:32.485806  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:32.485846  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:32.542399  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:32.542473  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:32.580289  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:32.580359  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:32.655211  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:32.655240  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:35.169649  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:35.195220  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:35.195289  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:35.255264  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:35.255284  199924 cri.go:89] found id: ""
	I1201 20:19:35.255292  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:35.255349  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:35.260497  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:35.260567  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:35.300811  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:35.300889  199924 cri.go:89] found id: ""
	I1201 20:19:35.300910  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:35.301002  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:35.306179  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:35.306259  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:35.344757  199924 cri.go:89] found id: ""
	I1201 20:19:35.344779  199924 logs.go:282] 0 containers: []
	W1201 20:19:35.344787  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:35.344798  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:35.344857  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:35.377563  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:35.377582  199924 cri.go:89] found id: ""
	I1201 20:19:35.377590  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:35.377647  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:35.382713  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:35.382781  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:35.416093  199924 cri.go:89] found id: ""
	I1201 20:19:35.416161  199924 logs.go:282] 0 containers: []
	W1201 20:19:35.416184  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:35.416211  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:35.416289  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:35.468291  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:35.468365  199924 cri.go:89] found id: ""
	I1201 20:19:35.468387  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:35.468461  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:35.472838  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:35.472955  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:35.502770  199924 cri.go:89] found id: ""
	I1201 20:19:35.502844  199924 logs.go:282] 0 containers: []
	W1201 20:19:35.502869  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:35.502906  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:35.502993  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:35.528735  199924 cri.go:89] found id: ""
	I1201 20:19:35.528794  199924 logs.go:282] 0 containers: []
	W1201 20:19:35.528818  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:35.528843  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:35.528872  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:35.593200  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:35.593237  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:35.609329  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:35.609360  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:35.648971  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:35.649003  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:35.690484  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:35.690522  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:35.741351  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:35.741382  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:35.865662  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:35.865685  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:35.865698  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:35.921626  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:35.921663  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:35.992579  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:35.992652  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:38.552854  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:38.563788  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:38.563870  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:38.598581  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:38.598604  199924 cri.go:89] found id: ""
	I1201 20:19:38.598613  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:38.598666  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:38.602594  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:38.602665  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:38.639066  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:38.639089  199924 cri.go:89] found id: ""
	I1201 20:19:38.639098  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:38.639153  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:38.643361  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:38.643435  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:38.674645  199924 cri.go:89] found id: ""
	I1201 20:19:38.674668  199924 logs.go:282] 0 containers: []
	W1201 20:19:38.674676  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:38.674683  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:38.674738  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:38.704793  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:38.704813  199924 cri.go:89] found id: ""
	I1201 20:19:38.704821  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:38.704875  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:38.708772  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:38.708847  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:38.734750  199924 cri.go:89] found id: ""
	I1201 20:19:38.734774  199924 logs.go:282] 0 containers: []
	W1201 20:19:38.734782  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:38.734789  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:38.734848  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:38.761458  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:38.761506  199924 cri.go:89] found id: ""
	I1201 20:19:38.761514  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:38.761574  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:38.765618  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:38.765703  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:38.791362  199924 cri.go:89] found id: ""
	I1201 20:19:38.791387  199924 logs.go:282] 0 containers: []
	W1201 20:19:38.791396  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:38.791402  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:38.791458  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:38.817581  199924 cri.go:89] found id: ""
	I1201 20:19:38.817603  199924 logs.go:282] 0 containers: []
	W1201 20:19:38.817618  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:38.817632  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:38.817643  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:38.875645  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:38.875683  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:38.911148  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:38.911181  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:38.943519  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:38.943552  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:38.975247  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:38.975277  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:38.988621  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:38.988650  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:39.059646  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:39.059669  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:39.059683  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:39.105823  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:39.105860  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:39.138644  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:39.138675  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:41.673612  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:41.688331  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:41.688412  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:41.716072  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:41.716094  199924 cri.go:89] found id: ""
	I1201 20:19:41.716102  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:41.716162  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:41.720026  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:41.720099  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:41.751674  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:41.751693  199924 cri.go:89] found id: ""
	I1201 20:19:41.751701  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:41.751757  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:41.756150  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:41.756216  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:41.791746  199924 cri.go:89] found id: ""
	I1201 20:19:41.791767  199924 logs.go:282] 0 containers: []
	W1201 20:19:41.791775  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:41.791781  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:41.791838  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:41.834973  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:41.834992  199924 cri.go:89] found id: ""
	I1201 20:19:41.835000  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:41.835053  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:41.839521  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:41.839591  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:41.869636  199924 cri.go:89] found id: ""
	I1201 20:19:41.869658  199924 logs.go:282] 0 containers: []
	W1201 20:19:41.869666  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:41.869678  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:41.869736  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:41.898040  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:41.898062  199924 cri.go:89] found id: ""
	I1201 20:19:41.898071  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:41.898131  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:41.902800  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:41.902873  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:41.946052  199924 cri.go:89] found id: ""
	I1201 20:19:41.946076  199924 logs.go:282] 0 containers: []
	W1201 20:19:41.946085  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:41.946091  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:41.946161  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:41.986432  199924 cri.go:89] found id: ""
	I1201 20:19:41.986461  199924 logs.go:282] 0 containers: []
	W1201 20:19:41.986471  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:41.986484  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:41.986496  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:42.005926  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:42.005956  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:42.063944  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:42.063978  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:42.099952  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:42.100036  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:42.150834  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:42.150873  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:42.223373  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:42.223457  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:42.306046  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:42.306067  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:42.306132  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:42.356399  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:42.356431  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:42.428569  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:42.428603  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:44.971693  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:44.987865  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:44.987936  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:45.075069  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:45.075093  199924 cri.go:89] found id: ""
	I1201 20:19:45.075101  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:45.075173  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:45.092776  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:45.092856  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:45.176023  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:45.176044  199924 cri.go:89] found id: ""
	I1201 20:19:45.176053  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:45.176116  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:45.181577  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:45.181659  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:45.216669  199924 cri.go:89] found id: ""
	I1201 20:19:45.216700  199924 logs.go:282] 0 containers: []
	W1201 20:19:45.216709  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:45.216717  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:45.216791  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:45.254761  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:45.254782  199924 cri.go:89] found id: ""
	I1201 20:19:45.254790  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:45.254854  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:45.262476  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:45.262632  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:45.313535  199924 cri.go:89] found id: ""
	I1201 20:19:45.313559  199924 logs.go:282] 0 containers: []
	W1201 20:19:45.313569  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:45.313576  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:45.313644  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:45.353301  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:45.353320  199924 cri.go:89] found id: ""
	I1201 20:19:45.353328  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:45.353389  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:45.374895  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:45.375026  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:45.457305  199924 cri.go:89] found id: ""
	I1201 20:19:45.457380  199924 logs.go:282] 0 containers: []
	W1201 20:19:45.457403  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:45.457423  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:45.457584  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:45.491761  199924 cri.go:89] found id: ""
	I1201 20:19:45.491839  199924 logs.go:282] 0 containers: []
	W1201 20:19:45.491863  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:45.491905  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:45.491935  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:45.576331  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:45.576350  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:45.576363  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:45.632075  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:45.632301  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:45.678784  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:45.678868  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:45.713150  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:45.713187  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:45.776907  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:45.776945  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:45.790691  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:45.790719  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:45.830860  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:45.831027  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:45.894565  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:45.894635  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:48.437088  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:48.447527  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:48.447601  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:48.480223  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:48.480245  199924 cri.go:89] found id: ""
	I1201 20:19:48.480253  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:48.480308  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:48.484366  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:48.484435  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:48.510028  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:48.510050  199924 cri.go:89] found id: ""
	I1201 20:19:48.510058  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:48.510124  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:48.514128  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:48.514197  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:48.539220  199924 cri.go:89] found id: ""
	I1201 20:19:48.539242  199924 logs.go:282] 0 containers: []
	W1201 20:19:48.539250  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:48.539256  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:48.539346  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:48.568479  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:48.568503  199924 cri.go:89] found id: ""
	I1201 20:19:48.568512  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:48.568568  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:48.572529  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:48.572602  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:48.599984  199924 cri.go:89] found id: ""
	I1201 20:19:48.600011  199924 logs.go:282] 0 containers: []
	W1201 20:19:48.600019  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:48.600026  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:48.600086  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:48.639873  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:48.639897  199924 cri.go:89] found id: ""
	I1201 20:19:48.639908  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:48.639968  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:48.648764  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:48.648893  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:48.687397  199924 cri.go:89] found id: ""
	I1201 20:19:48.687425  199924 logs.go:282] 0 containers: []
	W1201 20:19:48.687434  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:48.687440  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:48.687505  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:48.724501  199924 cri.go:89] found id: ""
	I1201 20:19:48.724533  199924 logs.go:282] 0 containers: []
	W1201 20:19:48.724542  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:48.724558  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:48.724578  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:48.824049  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:48.824089  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:48.946079  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:48.946107  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:48.946127  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:49.013051  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:49.013099  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:49.061542  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:49.061576  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:49.106978  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:49.107024  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:49.147995  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:49.148026  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:49.207721  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:49.207761  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:49.293164  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:49.293213  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:51.836075  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:51.847106  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:51.847178  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:51.891931  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:51.891957  199924 cri.go:89] found id: ""
	I1201 20:19:51.891967  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:51.892020  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:51.896377  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:51.896455  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:51.931916  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:51.931936  199924 cri.go:89] found id: ""
	I1201 20:19:51.931944  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:51.932010  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:51.936548  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:51.936617  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:51.970845  199924 cri.go:89] found id: ""
	I1201 20:19:51.970867  199924 logs.go:282] 0 containers: []
	W1201 20:19:51.970875  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:51.970881  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:51.970938  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:51.997844  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:51.997864  199924 cri.go:89] found id: ""
	I1201 20:19:51.997872  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:51.997934  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:52.002402  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:52.002478  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:52.032333  199924 cri.go:89] found id: ""
	I1201 20:19:52.032353  199924 logs.go:282] 0 containers: []
	W1201 20:19:52.032361  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:52.032369  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:52.032448  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:52.058289  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:52.058308  199924 cri.go:89] found id: ""
	I1201 20:19:52.058372  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:52.058460  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:52.062369  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:52.062440  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:52.088761  199924 cri.go:89] found id: ""
	I1201 20:19:52.088782  199924 logs.go:282] 0 containers: []
	W1201 20:19:52.088791  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:52.088797  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:52.088856  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:52.114270  199924 cri.go:89] found id: ""
	I1201 20:19:52.114293  199924 logs.go:282] 0 containers: []
	W1201 20:19:52.114302  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:52.114315  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:52.114341  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:52.147625  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:52.147660  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:52.186606  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:52.186645  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:52.220840  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:52.220877  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:52.249929  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:52.249963  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:52.307843  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:52.307876  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:52.321044  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:52.321070  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:52.412355  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:52.412429  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:52.412450  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:52.452978  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:52.453011  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:54.986961  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:54.997980  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:54.998048  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:55.045603  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:55.045679  199924 cri.go:89] found id: ""
	I1201 20:19:55.045710  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:55.045808  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:55.050960  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:55.051036  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:55.079107  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:55.079128  199924 cri.go:89] found id: ""
	I1201 20:19:55.079136  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:55.079193  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:55.083470  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:55.083551  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:55.110133  199924 cri.go:89] found id: ""
	I1201 20:19:55.110158  199924 logs.go:282] 0 containers: []
	W1201 20:19:55.110167  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:55.110174  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:55.110238  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:55.138236  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:55.138258  199924 cri.go:89] found id: ""
	I1201 20:19:55.138267  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:55.138326  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:55.142528  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:55.142610  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:55.168678  199924 cri.go:89] found id: ""
	I1201 20:19:55.168705  199924 logs.go:282] 0 containers: []
	W1201 20:19:55.168715  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:55.168722  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:55.168782  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:55.195380  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:55.195445  199924 cri.go:89] found id: ""
	I1201 20:19:55.195461  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:55.195530  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:55.199884  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:55.199985  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:55.230283  199924 cri.go:89] found id: ""
	I1201 20:19:55.230309  199924 logs.go:282] 0 containers: []
	W1201 20:19:55.230319  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:55.230326  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:55.230409  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:55.255432  199924 cri.go:89] found id: ""
	I1201 20:19:55.255459  199924 logs.go:282] 0 containers: []
	W1201 20:19:55.255468  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:55.255504  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:55.255519  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:55.315366  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:55.315401  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:55.329301  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:55.329330  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:55.412100  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:55.412120  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:55.412136  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:55.450897  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:55.450933  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:55.486469  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:55.486505  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:55.519256  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:55.519291  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:55.552024  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:55.552063  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:55.585108  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:55.585143  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:58.114504  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:19:58.125305  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:19:58.125377  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:19:58.152344  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:58.152368  199924 cri.go:89] found id: ""
	I1201 20:19:58.152386  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:19:58.152442  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:58.156512  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:19:58.156585  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:19:58.186433  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:58.186459  199924 cri.go:89] found id: ""
	I1201 20:19:58.186468  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:19:58.186523  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:58.190513  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:19:58.190586  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:19:58.218257  199924 cri.go:89] found id: ""
	I1201 20:19:58.218284  199924 logs.go:282] 0 containers: []
	W1201 20:19:58.218293  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:19:58.218300  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:19:58.218384  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:19:58.244520  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:19:58.244543  199924 cri.go:89] found id: ""
	I1201 20:19:58.244551  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:19:58.244625  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:58.248512  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:19:58.248640  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:19:58.276450  199924 cri.go:89] found id: ""
	I1201 20:19:58.276473  199924 logs.go:282] 0 containers: []
	W1201 20:19:58.276482  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:19:58.276507  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:19:58.276597  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:19:58.301807  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:58.301839  199924 cri.go:89] found id: ""
	I1201 20:19:58.301849  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:19:58.301913  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:19:58.307721  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:19:58.307849  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:19:58.332720  199924 cri.go:89] found id: ""
	I1201 20:19:58.332752  199924 logs.go:282] 0 containers: []
	W1201 20:19:58.332760  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:19:58.332767  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:19:58.332833  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:19:58.367334  199924 cri.go:89] found id: ""
	I1201 20:19:58.367361  199924 logs.go:282] 0 containers: []
	W1201 20:19:58.367371  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:19:58.367402  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:19:58.367419  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:19:58.420262  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:19:58.420295  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:19:58.453241  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:19:58.453274  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:19:58.491680  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:19:58.491709  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:19:58.526445  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:19:58.526480  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:19:58.563476  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:19:58.563506  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:19:58.624216  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:19:58.624249  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:19:58.637727  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:19:58.637755  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:19:58.706615  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:19:58.706637  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:19:58.706650  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:01.243786  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:01.254938  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:01.255009  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:01.282592  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:01.282621  199924 cri.go:89] found id: ""
	I1201 20:20:01.282630  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:01.282714  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:01.287346  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:01.287432  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:01.315220  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:01.315240  199924 cri.go:89] found id: ""
	I1201 20:20:01.315249  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:01.315305  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:01.319488  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:01.319563  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:01.351877  199924 cri.go:89] found id: ""
	I1201 20:20:01.351902  199924 logs.go:282] 0 containers: []
	W1201 20:20:01.351912  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:01.351918  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:01.351987  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:01.387475  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:01.387498  199924 cri.go:89] found id: ""
	I1201 20:20:01.387507  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:01.387564  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:01.394913  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:01.394994  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:01.423536  199924 cri.go:89] found id: ""
	I1201 20:20:01.423562  199924 logs.go:282] 0 containers: []
	W1201 20:20:01.423572  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:01.423578  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:01.423679  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:01.450160  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:01.450193  199924 cri.go:89] found id: ""
	I1201 20:20:01.450209  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:01.450274  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:01.454261  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:01.454364  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:01.480055  199924 cri.go:89] found id: ""
	I1201 20:20:01.480081  199924 logs.go:282] 0 containers: []
	W1201 20:20:01.480089  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:01.480099  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:01.480177  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:01.512862  199924 cri.go:89] found id: ""
	I1201 20:20:01.512888  199924 logs.go:282] 0 containers: []
	W1201 20:20:01.512897  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:01.512911  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:01.512942  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:01.543173  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:01.543202  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:01.601282  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:01.601317  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:01.640066  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:01.640098  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:01.678264  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:01.678295  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:01.691601  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:01.691631  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:01.762305  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:01.762328  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:01.762343  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:01.795240  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:01.795270  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:01.833088  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:01.833119  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:04.366695  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:04.380308  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:04.380377  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:04.409536  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:04.409557  199924 cri.go:89] found id: ""
	I1201 20:20:04.409564  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:04.409624  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:04.413548  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:04.413622  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:04.440370  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:04.440390  199924 cri.go:89] found id: ""
	I1201 20:20:04.440398  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:04.440453  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:04.444405  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:04.444477  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:04.470406  199924 cri.go:89] found id: ""
	I1201 20:20:04.470429  199924 logs.go:282] 0 containers: []
	W1201 20:20:04.470437  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:04.470443  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:04.470503  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:04.496279  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:04.496298  199924 cri.go:89] found id: ""
	I1201 20:20:04.496306  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:04.496360  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:04.500151  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:04.500219  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:04.525737  199924 cri.go:89] found id: ""
	I1201 20:20:04.525759  199924 logs.go:282] 0 containers: []
	W1201 20:20:04.525767  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:04.525774  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:04.525832  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:04.551598  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:04.551625  199924 cri.go:89] found id: ""
	I1201 20:20:04.551634  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:04.551698  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:04.555636  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:04.555712  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:04.581480  199924 cri.go:89] found id: ""
	I1201 20:20:04.581532  199924 logs.go:282] 0 containers: []
	W1201 20:20:04.581540  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:04.581547  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:04.581624  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:04.606318  199924 cri.go:89] found id: ""
	I1201 20:20:04.606344  199924 logs.go:282] 0 containers: []
	W1201 20:20:04.606353  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:04.606366  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:04.606381  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:04.665314  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:04.665350  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:04.678468  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:04.678500  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:04.747218  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:04.747241  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:04.747255  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:04.785860  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:04.785893  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:04.819331  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:04.819365  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:04.860918  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:04.860948  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:04.895019  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:04.895049  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:04.924617  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:04.924646  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:07.457600  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:07.468206  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:07.468278  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:07.493987  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:07.494013  199924 cri.go:89] found id: ""
	I1201 20:20:07.494022  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:07.494079  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:07.498054  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:07.498140  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:07.523690  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:07.523713  199924 cri.go:89] found id: ""
	I1201 20:20:07.523722  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:07.523781  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:07.527678  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:07.527751  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:07.566948  199924 cri.go:89] found id: ""
	I1201 20:20:07.566984  199924 logs.go:282] 0 containers: []
	W1201 20:20:07.566993  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:07.566999  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:07.567060  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:07.591862  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:07.591884  199924 cri.go:89] found id: ""
	I1201 20:20:07.591892  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:07.591945  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:07.595736  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:07.595807  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:07.620993  199924 cri.go:89] found id: ""
	I1201 20:20:07.621014  199924 logs.go:282] 0 containers: []
	W1201 20:20:07.621023  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:07.621029  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:07.621085  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:07.648456  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:07.648475  199924 cri.go:89] found id: ""
	I1201 20:20:07.648482  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:07.648535  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:07.652588  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:07.652658  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:07.678915  199924 cri.go:89] found id: ""
	I1201 20:20:07.678938  199924 logs.go:282] 0 containers: []
	W1201 20:20:07.678946  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:07.678952  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:07.679011  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:07.704458  199924 cri.go:89] found id: ""
	I1201 20:20:07.704536  199924 logs.go:282] 0 containers: []
	W1201 20:20:07.704560  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:07.704602  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:07.704633  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:07.763078  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:07.763112  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:07.800284  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:07.800315  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:07.830008  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:07.830042  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:07.843193  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:07.843222  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:07.913573  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:07.913595  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:07.913608  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:07.949553  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:07.949588  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:07.988458  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:07.988488  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:08.030791  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:08.030822  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:10.565539  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:10.576443  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:10.576515  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:10.602210  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:10.602231  199924 cri.go:89] found id: ""
	I1201 20:20:10.602239  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:10.602296  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:10.606287  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:10.606407  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:10.634512  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:10.634537  199924 cri.go:89] found id: ""
	I1201 20:20:10.634545  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:10.634599  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:10.638529  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:10.638604  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:10.663869  199924 cri.go:89] found id: ""
	I1201 20:20:10.663894  199924 logs.go:282] 0 containers: []
	W1201 20:20:10.663903  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:10.663910  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:10.663966  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:10.693048  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:10.693072  199924 cri.go:89] found id: ""
	I1201 20:20:10.693080  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:10.693136  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:10.697025  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:10.697104  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:10.721345  199924 cri.go:89] found id: ""
	I1201 20:20:10.721371  199924 logs.go:282] 0 containers: []
	W1201 20:20:10.721380  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:10.721386  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:10.721445  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:10.746730  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:10.746752  199924 cri.go:89] found id: ""
	I1201 20:20:10.746760  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:10.746834  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:10.750902  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:10.751009  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:10.775932  199924 cri.go:89] found id: ""
	I1201 20:20:10.775956  199924 logs.go:282] 0 containers: []
	W1201 20:20:10.775965  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:10.775973  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:10.776028  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:10.804994  199924 cri.go:89] found id: ""
	I1201 20:20:10.805066  199924 logs.go:282] 0 containers: []
	W1201 20:20:10.805089  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:10.805136  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:10.805169  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:10.863486  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:10.863518  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:10.905419  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:10.905452  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:10.938940  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:10.938973  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:10.972510  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:10.972540  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:11.006525  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:11.006562  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:11.049250  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:11.049322  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:11.062763  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:11.062795  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:11.159585  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:11.159660  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:11.159695  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:13.694941  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:13.705926  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:13.706004  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:13.731789  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:13.731812  199924 cri.go:89] found id: ""
	I1201 20:20:13.731820  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:13.731880  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:13.736051  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:13.736126  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:13.763336  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:13.763361  199924 cri.go:89] found id: ""
	I1201 20:20:13.763370  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:13.763426  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:13.767708  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:13.767784  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:13.798568  199924 cri.go:89] found id: ""
	I1201 20:20:13.798595  199924 logs.go:282] 0 containers: []
	W1201 20:20:13.798605  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:13.798611  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:13.798668  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:13.823817  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:13.823851  199924 cri.go:89] found id: ""
	I1201 20:20:13.823860  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:13.823937  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:13.828050  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:13.828129  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:13.857363  199924 cri.go:89] found id: ""
	I1201 20:20:13.857390  199924 logs.go:282] 0 containers: []
	W1201 20:20:13.857399  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:13.857406  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:13.857471  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:13.882680  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:13.882703  199924 cri.go:89] found id: ""
	I1201 20:20:13.882711  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:13.882768  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:13.886689  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:13.886761  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:13.911456  199924 cri.go:89] found id: ""
	I1201 20:20:13.911483  199924 logs.go:282] 0 containers: []
	W1201 20:20:13.911503  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:13.911527  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:13.911608  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:13.937663  199924 cri.go:89] found id: ""
	I1201 20:20:13.937691  199924 logs.go:282] 0 containers: []
	W1201 20:20:13.937700  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:13.937713  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:13.937723  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:13.997163  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:13.997205  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:14.015926  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:14.015965  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:14.083588  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:14.083611  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:14.083624  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:14.125834  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:14.125870  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:14.162416  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:14.162449  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:14.197462  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:14.197507  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:14.235163  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:14.235192  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:14.269461  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:14.269503  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:16.801646  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:16.812436  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:16.812534  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:16.840760  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:16.840780  199924 cri.go:89] found id: ""
	I1201 20:20:16.840788  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:16.840842  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:16.844567  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:16.844647  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:16.870284  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:16.870308  199924 cri.go:89] found id: ""
	I1201 20:20:16.870317  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:16.870373  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:16.874593  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:16.874668  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:16.900131  199924 cri.go:89] found id: ""
	I1201 20:20:16.900156  199924 logs.go:282] 0 containers: []
	W1201 20:20:16.900165  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:16.900171  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:16.900232  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:16.927594  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:16.927623  199924 cri.go:89] found id: ""
	I1201 20:20:16.927633  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:16.927688  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:16.931859  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:16.931939  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:16.956627  199924 cri.go:89] found id: ""
	I1201 20:20:16.956653  199924 logs.go:282] 0 containers: []
	W1201 20:20:16.956663  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:16.956669  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:16.956728  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:16.982838  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:16.982862  199924 cri.go:89] found id: ""
	I1201 20:20:16.982871  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:16.982948  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:16.987020  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:16.987102  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:17.015109  199924 cri.go:89] found id: ""
	I1201 20:20:17.015136  199924 logs.go:282] 0 containers: []
	W1201 20:20:17.015146  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:17.015152  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:17.015239  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:17.048515  199924 cri.go:89] found id: ""
	I1201 20:20:17.048549  199924 logs.go:282] 0 containers: []
	W1201 20:20:17.048559  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:17.048591  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:17.048635  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:17.131381  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:17.131404  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:17.131417  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:17.179222  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:17.179256  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:17.211408  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:17.211438  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:17.224288  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:17.224324  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:17.263065  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:17.263099  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:17.296980  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:17.297012  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:17.329633  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:17.329665  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:17.357615  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:17.357649  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:19.917617  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:19.928314  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:19.928382  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:19.961619  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:19.961644  199924 cri.go:89] found id: ""
	I1201 20:20:19.961652  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:19.961717  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:19.965855  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:19.965927  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:19.996879  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:19.996913  199924 cri.go:89] found id: ""
	I1201 20:20:19.996924  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:19.996992  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:20.000814  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:20.000888  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:20.044717  199924 cri.go:89] found id: ""
	I1201 20:20:20.044744  199924 logs.go:282] 0 containers: []
	W1201 20:20:20.044753  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:20.044760  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:20.044870  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:20.071153  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:20.071221  199924 cri.go:89] found id: ""
	I1201 20:20:20.071244  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:20.071329  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:20.075223  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:20.075315  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:20.122972  199924 cri.go:89] found id: ""
	I1201 20:20:20.123009  199924 logs.go:282] 0 containers: []
	W1201 20:20:20.123024  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:20.123030  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:20.123126  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:20.157592  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:20.157662  199924 cri.go:89] found id: ""
	I1201 20:20:20.157678  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:20.157735  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:20.162270  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:20.162351  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:20.188799  199924 cri.go:89] found id: ""
	I1201 20:20:20.188830  199924 logs.go:282] 0 containers: []
	W1201 20:20:20.188838  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:20.188845  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:20.188915  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:20.214846  199924 cri.go:89] found id: ""
	I1201 20:20:20.214913  199924 logs.go:282] 0 containers: []
	W1201 20:20:20.214935  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:20.214957  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:20.214969  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:20.248175  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:20.248204  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:20.281076  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:20.281110  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:20.321425  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:20.321458  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:20.379022  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:20.379058  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:20.446474  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:20.446499  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:20.446525  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:20.482531  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:20.482562  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:20.521181  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:20.521211  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:20.534661  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:20.534690  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:23.068960  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:23.079540  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:23.079655  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:23.115525  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:23.115545  199924 cri.go:89] found id: ""
	I1201 20:20:23.115558  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:23.115614  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:23.120349  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:23.120473  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:23.155931  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:23.155965  199924 cri.go:89] found id: ""
	I1201 20:20:23.155973  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:23.156050  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:23.160696  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:23.160767  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:23.185575  199924 cri.go:89] found id: ""
	I1201 20:20:23.185603  199924 logs.go:282] 0 containers: []
	W1201 20:20:23.185612  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:23.185618  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:23.185676  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:23.212350  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:23.212420  199924 cri.go:89] found id: ""
	I1201 20:20:23.212442  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:23.212541  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:23.216460  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:23.216534  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:23.246102  199924 cri.go:89] found id: ""
	I1201 20:20:23.246132  199924 logs.go:282] 0 containers: []
	W1201 20:20:23.246141  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:23.246148  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:23.246207  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:23.284441  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:23.284463  199924 cri.go:89] found id: ""
	I1201 20:20:23.284471  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:23.284528  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:23.288679  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:23.288748  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:23.315452  199924 cri.go:89] found id: ""
	I1201 20:20:23.315475  199924 logs.go:282] 0 containers: []
	W1201 20:20:23.315484  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:23.315491  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:23.315547  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:23.340984  199924 cri.go:89] found id: ""
	I1201 20:20:23.341018  199924 logs.go:282] 0 containers: []
	W1201 20:20:23.341027  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:23.341041  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:23.341056  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:23.369203  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:23.369238  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:23.382042  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:23.382069  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:23.413603  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:23.413639  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:23.455088  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:23.455119  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:23.518789  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:23.518833  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:23.587812  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:23.587901  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:23.587926  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:23.632664  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:23.632696  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:23.665959  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:23.665992  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:26.199815  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:26.210670  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:26.210740  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:26.236204  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:26.236227  199924 cri.go:89] found id: ""
	I1201 20:20:26.236235  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:26.236290  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:26.240192  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:26.240264  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:26.266996  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:26.267021  199924 cri.go:89] found id: ""
	I1201 20:20:26.267030  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:26.267085  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:26.271089  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:26.271206  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:26.298916  199924 cri.go:89] found id: ""
	I1201 20:20:26.298941  199924 logs.go:282] 0 containers: []
	W1201 20:20:26.298950  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:26.298956  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:26.299015  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:26.325890  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:26.325915  199924 cri.go:89] found id: ""
	I1201 20:20:26.325924  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:26.325985  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:26.330195  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:26.330272  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:26.356370  199924 cri.go:89] found id: ""
	I1201 20:20:26.356447  199924 logs.go:282] 0 containers: []
	W1201 20:20:26.356469  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:26.356488  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:26.356581  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:26.381676  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:26.381747  199924 cri.go:89] found id: ""
	I1201 20:20:26.381769  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:26.381863  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:26.386004  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:26.386084  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:26.411340  199924 cri.go:89] found id: ""
	I1201 20:20:26.411363  199924 logs.go:282] 0 containers: []
	W1201 20:20:26.411375  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:26.411381  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:26.411437  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:26.435864  199924 cri.go:89] found id: ""
	I1201 20:20:26.435890  199924 logs.go:282] 0 containers: []
	W1201 20:20:26.435901  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:26.435916  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:26.435927  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:26.468075  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:26.468107  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:26.499618  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:26.499648  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:26.532189  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:26.532227  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:26.560461  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:26.560489  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:26.619706  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:26.619741  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:26.692468  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:26.692532  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:26.692551  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:26.725952  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:26.725988  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:26.759786  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:26.759817  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:29.275171  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:29.286039  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:29.286111  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:29.312231  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:29.312252  199924 cri.go:89] found id: ""
	I1201 20:20:29.312261  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:29.312313  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:29.316182  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:29.316260  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:29.341292  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:29.341317  199924 cri.go:89] found id: ""
	I1201 20:20:29.341325  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:29.341381  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:29.345112  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:29.345184  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:29.370828  199924 cri.go:89] found id: ""
	I1201 20:20:29.370854  199924 logs.go:282] 0 containers: []
	W1201 20:20:29.370863  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:29.370869  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:29.370925  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:29.396244  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:29.396266  199924 cri.go:89] found id: ""
	I1201 20:20:29.396275  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:29.396374  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:29.400475  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:29.400556  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:29.432477  199924 cri.go:89] found id: ""
	I1201 20:20:29.432511  199924 logs.go:282] 0 containers: []
	W1201 20:20:29.432520  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:29.432528  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:29.432595  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:29.458092  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:29.458130  199924 cri.go:89] found id: ""
	I1201 20:20:29.458140  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:29.458210  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:29.462159  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:29.462229  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:29.492243  199924 cri.go:89] found id: ""
	I1201 20:20:29.492266  199924 logs.go:282] 0 containers: []
	W1201 20:20:29.492275  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:29.492281  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:29.492345  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:29.519063  199924 cri.go:89] found id: ""
	I1201 20:20:29.519087  199924 logs.go:282] 0 containers: []
	W1201 20:20:29.519095  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:29.519109  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:29.519121  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:29.532433  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:29.532462  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:29.598544  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:29.598566  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:29.598583  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:29.632394  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:29.632423  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:29.663058  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:29.663088  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:29.695214  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:29.695249  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:29.754241  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:29.754276  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:29.791878  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:29.791913  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:29.837351  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:29.837427  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:32.380465  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:32.394445  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:32.394531  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:32.444071  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:32.444096  199924 cri.go:89] found id: ""
	I1201 20:20:32.444105  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:32.444160  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:32.448431  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:32.448506  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:32.484313  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:32.484387  199924 cri.go:89] found id: ""
	I1201 20:20:32.484411  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:32.484494  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:32.488797  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:32.488866  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:32.515727  199924 cri.go:89] found id: ""
	I1201 20:20:32.515748  199924 logs.go:282] 0 containers: []
	W1201 20:20:32.515760  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:32.515769  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:32.515830  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:32.551110  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:32.551129  199924 cri.go:89] found id: ""
	I1201 20:20:32.551137  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:32.551190  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:32.555585  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:32.555695  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:32.590615  199924 cri.go:89] found id: ""
	I1201 20:20:32.590680  199924 logs.go:282] 0 containers: []
	W1201 20:20:32.590703  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:32.590724  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:32.590821  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:32.619907  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:32.619980  199924 cri.go:89] found id: ""
	I1201 20:20:32.620002  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:32.620088  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:32.624364  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:32.624474  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:32.650980  199924 cri.go:89] found id: ""
	I1201 20:20:32.651056  199924 logs.go:282] 0 containers: []
	W1201 20:20:32.651079  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:32.651099  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:32.651196  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:32.686648  199924 cri.go:89] found id: ""
	I1201 20:20:32.686715  199924 logs.go:282] 0 containers: []
	W1201 20:20:32.686738  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:32.686767  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:32.686806  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:32.754499  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:32.754573  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:32.768498  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:32.768565  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:32.801686  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:32.801762  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:32.846774  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:32.846844  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:32.975914  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:32.975979  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:32.976013  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:33.032861  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:33.032936  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:33.083039  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:33.083116  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:33.118966  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:33.119037  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:35.654499  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:35.665420  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:35.665518  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:35.690801  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:35.690825  199924 cri.go:89] found id: ""
	I1201 20:20:35.690833  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:35.690894  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:35.694841  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:35.694915  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:35.719902  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:35.719924  199924 cri.go:89] found id: ""
	I1201 20:20:35.719932  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:35.719990  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:35.723989  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:35.724067  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:35.750150  199924 cri.go:89] found id: ""
	I1201 20:20:35.750177  199924 logs.go:282] 0 containers: []
	W1201 20:20:35.750186  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:35.750193  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:35.750279  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:35.775338  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:35.775358  199924 cri.go:89] found id: ""
	I1201 20:20:35.775367  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:35.775446  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:35.779317  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:35.779389  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:35.809932  199924 cri.go:89] found id: ""
	I1201 20:20:35.809956  199924 logs.go:282] 0 containers: []
	W1201 20:20:35.809965  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:35.809971  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:35.810028  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:35.836008  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:35.836031  199924 cri.go:89] found id: ""
	I1201 20:20:35.836039  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:35.836090  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:35.840556  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:35.840626  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:35.873201  199924 cri.go:89] found id: ""
	I1201 20:20:35.873226  199924 logs.go:282] 0 containers: []
	W1201 20:20:35.873234  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:35.873241  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:35.873297  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:35.908481  199924 cri.go:89] found id: ""
	I1201 20:20:35.908507  199924 logs.go:282] 0 containers: []
	W1201 20:20:35.908516  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:35.908532  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:35.908543  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:35.941894  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:35.941929  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:35.970691  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:35.970764  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:36.029077  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:36.029116  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:36.042818  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:36.042847  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:36.079191  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:36.079222  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:36.146040  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:36.146103  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:36.146162  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:36.183075  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:36.183108  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:36.216438  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:36.216472  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:38.751001  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:38.762524  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:38.762593  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:38.789412  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:38.789436  199924 cri.go:89] found id: ""
	I1201 20:20:38.789445  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:38.789539  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:38.793361  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:38.793436  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:38.819624  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:38.819687  199924 cri.go:89] found id: ""
	I1201 20:20:38.819701  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:38.819759  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:38.823774  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:38.823865  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:38.856854  199924 cri.go:89] found id: ""
	I1201 20:20:38.856882  199924 logs.go:282] 0 containers: []
	W1201 20:20:38.856891  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:38.856898  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:38.856955  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:38.887849  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:38.887874  199924 cri.go:89] found id: ""
	I1201 20:20:38.887882  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:38.887937  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:38.892351  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:38.892423  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:38.920350  199924 cri.go:89] found id: ""
	I1201 20:20:38.920379  199924 logs.go:282] 0 containers: []
	W1201 20:20:38.920387  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:38.920395  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:38.920450  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:38.950040  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:38.950063  199924 cri.go:89] found id: ""
	I1201 20:20:38.950072  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:38.950172  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:38.954408  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:38.954490  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:38.980057  199924 cri.go:89] found id: ""
	I1201 20:20:38.980083  199924 logs.go:282] 0 containers: []
	W1201 20:20:38.980092  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:38.980101  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:38.980159  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:39.006589  199924 cri.go:89] found id: ""
	I1201 20:20:39.006613  199924 logs.go:282] 0 containers: []
	W1201 20:20:39.006622  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:39.006636  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:39.006649  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:39.046639  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:39.046676  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:39.082491  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:39.082524  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:39.144469  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:39.144507  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:39.177004  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:39.177035  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:39.211089  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:39.211120  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:39.240008  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:39.240036  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:39.253271  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:39.253302  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:39.325707  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:39.325727  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:39.325741  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:41.858711  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:41.871034  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:41.871098  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:41.905783  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:41.905807  199924 cri.go:89] found id: ""
	I1201 20:20:41.905816  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:41.905872  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:41.909876  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:41.909952  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:41.936177  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:41.936200  199924 cri.go:89] found id: ""
	I1201 20:20:41.936208  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:41.936274  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:41.940274  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:41.940353  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:41.972251  199924 cri.go:89] found id: ""
	I1201 20:20:41.972277  199924 logs.go:282] 0 containers: []
	W1201 20:20:41.972285  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:41.972292  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:41.972361  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:41.998681  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:41.998704  199924 cri.go:89] found id: ""
	I1201 20:20:41.998712  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:41.998778  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:42.002975  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:42.003050  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:42.032594  199924 cri.go:89] found id: ""
	I1201 20:20:42.032621  199924 logs.go:282] 0 containers: []
	W1201 20:20:42.032631  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:42.032639  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:42.032707  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:42.063366  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:42.063391  199924 cri.go:89] found id: ""
	I1201 20:20:42.063401  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:42.063479  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:42.068004  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:42.068087  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:42.098113  199924 cri.go:89] found id: ""
	I1201 20:20:42.098209  199924 logs.go:282] 0 containers: []
	W1201 20:20:42.098234  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:42.098272  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:42.098364  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:42.131316  199924 cri.go:89] found id: ""
	I1201 20:20:42.131342  199924 logs.go:282] 0 containers: []
	W1201 20:20:42.131366  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:42.131382  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:42.131401  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:42.171991  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:42.172079  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:42.222956  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:42.223006  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:42.259085  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:42.259120  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:42.274046  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:42.274075  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:42.311504  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:42.311538  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:42.345493  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:42.345561  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:42.378443  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:42.378469  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:42.438394  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:42.438429  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:42.509112  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:45.009740  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:45.037343  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:45.037427  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:45.091962  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:45.092898  199924 cri.go:89] found id: ""
	I1201 20:20:45.092966  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:45.106330  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:45.112378  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:45.112458  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:45.162031  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:45.162055  199924 cri.go:89] found id: ""
	I1201 20:20:45.162064  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:45.163520  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:45.169459  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:45.169637  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:45.240867  199924 cri.go:89] found id: ""
	I1201 20:20:45.240951  199924 logs.go:282] 0 containers: []
	W1201 20:20:45.240983  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:45.241004  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:45.241102  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:45.287985  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:45.288007  199924 cri.go:89] found id: ""
	I1201 20:20:45.288018  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:45.288092  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:45.295411  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:45.295501  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:45.336075  199924 cri.go:89] found id: ""
	I1201 20:20:45.336170  199924 logs.go:282] 0 containers: []
	W1201 20:20:45.336187  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:45.336196  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:45.336348  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:45.365565  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:45.365601  199924 cri.go:89] found id: ""
	I1201 20:20:45.365620  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:45.365678  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:45.369883  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:45.369954  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:45.398653  199924 cri.go:89] found id: ""
	I1201 20:20:45.398677  199924 logs.go:282] 0 containers: []
	W1201 20:20:45.398686  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:45.398694  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:45.398755  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:45.427641  199924 cri.go:89] found id: ""
	I1201 20:20:45.427715  199924 logs.go:282] 0 containers: []
	W1201 20:20:45.427729  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:45.427747  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:45.427758  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:45.487361  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:45.487397  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:45.503634  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:45.503660  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:45.579429  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:45.579494  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:45.579514  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:45.617340  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:45.617373  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:45.667786  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:45.667815  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:45.707470  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:45.707515  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:45.738161  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:45.738191  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:45.775732  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:45.775763  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:48.311868  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:48.323121  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:48.323191  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:48.348612  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:48.348635  199924 cri.go:89] found id: ""
	I1201 20:20:48.348643  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:48.348699  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:48.352688  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:48.352837  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:48.384678  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:48.384700  199924 cri.go:89] found id: ""
	I1201 20:20:48.384712  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:48.384766  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:48.388791  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:48.388861  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:48.415330  199924 cri.go:89] found id: ""
	I1201 20:20:48.415354  199924 logs.go:282] 0 containers: []
	W1201 20:20:48.415363  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:48.415370  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:48.415427  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:48.444804  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:48.444824  199924 cri.go:89] found id: ""
	I1201 20:20:48.444833  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:48.444893  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:48.448825  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:48.448892  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:48.481158  199924 cri.go:89] found id: ""
	I1201 20:20:48.481238  199924 logs.go:282] 0 containers: []
	W1201 20:20:48.481261  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:48.481280  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:48.481374  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:48.508572  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:48.508592  199924 cri.go:89] found id: ""
	I1201 20:20:48.508600  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:48.508655  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:48.512827  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:48.512901  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:48.540075  199924 cri.go:89] found id: ""
	I1201 20:20:48.540103  199924 logs.go:282] 0 containers: []
	W1201 20:20:48.540115  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:48.540123  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:48.540183  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:48.567956  199924 cri.go:89] found id: ""
	I1201 20:20:48.567983  199924 logs.go:282] 0 containers: []
	W1201 20:20:48.567992  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:48.568008  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:48.568021  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:48.618242  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:48.618278  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:48.661426  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:48.661463  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:48.697727  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:48.697765  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:48.727116  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:48.727147  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:48.741828  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:48.741861  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:48.778266  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:48.778299  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:48.810518  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:48.810546  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:48.868734  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:48.868765  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:48.932478  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:51.432757  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:51.444318  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:51.444389  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:51.470873  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:51.470893  199924 cri.go:89] found id: ""
	I1201 20:20:51.470901  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:51.470959  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:51.475015  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:51.475085  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:51.507320  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:51.507344  199924 cri.go:89] found id: ""
	I1201 20:20:51.507353  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:51.507408  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:51.511487  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:51.511562  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:51.536818  199924 cri.go:89] found id: ""
	I1201 20:20:51.536840  199924 logs.go:282] 0 containers: []
	W1201 20:20:51.536849  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:51.536855  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:51.536918  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:51.564540  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:51.564565  199924 cri.go:89] found id: ""
	I1201 20:20:51.564574  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:51.564629  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:51.568614  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:51.568687  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:51.598386  199924 cri.go:89] found id: ""
	I1201 20:20:51.598412  199924 logs.go:282] 0 containers: []
	W1201 20:20:51.598422  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:51.598429  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:51.598488  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:51.634399  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:51.634422  199924 cri.go:89] found id: ""
	I1201 20:20:51.634431  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:51.634489  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:51.639608  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:51.639688  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:51.669339  199924 cri.go:89] found id: ""
	I1201 20:20:51.669415  199924 logs.go:282] 0 containers: []
	W1201 20:20:51.669439  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:51.669454  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:51.669557  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:51.695041  199924 cri.go:89] found id: ""
	I1201 20:20:51.695065  199924 logs.go:282] 0 containers: []
	W1201 20:20:51.695073  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:51.695088  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:51.695105  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:51.753423  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:51.753458  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:51.790282  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:51.790318  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:51.823217  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:51.823249  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:51.856644  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:51.856674  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:51.891072  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:51.891118  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:51.923685  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:51.923716  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:51.937873  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:51.937902  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:52.005295  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:52.005316  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:52.005329  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:54.542351  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:54.554031  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:54.554098  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:54.592171  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:54.592191  199924 cri.go:89] found id: ""
	I1201 20:20:54.592199  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:54.592257  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:54.596610  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:54.596682  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:54.632921  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:54.632995  199924 cri.go:89] found id: ""
	I1201 20:20:54.633018  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:54.633091  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:54.638628  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:54.638702  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:54.668910  199924 cri.go:89] found id: ""
	I1201 20:20:54.668932  199924 logs.go:282] 0 containers: []
	W1201 20:20:54.668941  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:54.668947  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:54.669005  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:54.695200  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:54.695221  199924 cri.go:89] found id: ""
	I1201 20:20:54.695229  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:54.695342  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:54.699358  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:54.699427  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:54.725201  199924 cri.go:89] found id: ""
	I1201 20:20:54.725224  199924 logs.go:282] 0 containers: []
	W1201 20:20:54.725232  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:54.725239  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:54.725300  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:54.757030  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:54.757057  199924 cri.go:89] found id: ""
	I1201 20:20:54.757065  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:54.757121  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:54.761533  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:54.761608  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:54.787877  199924 cri.go:89] found id: ""
	I1201 20:20:54.787902  199924 logs.go:282] 0 containers: []
	W1201 20:20:54.787910  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:54.787917  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:54.787976  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:54.813150  199924 cri.go:89] found id: ""
	I1201 20:20:54.813172  199924 logs.go:282] 0 containers: []
	W1201 20:20:54.813181  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:54.813198  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:54.813215  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:54.846376  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:54.846412  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:20:54.882537  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:54.882570  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:54.914121  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:54.914165  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:54.946294  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:54.946330  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:54.978493  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:54.978526  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:55.049456  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:55.049513  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:55.067197  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:55.067231  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:55.163428  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:55.163457  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:55.163476  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:57.707669  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:20:57.718346  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:20:57.718420  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:20:57.743575  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:57.743602  199924 cri.go:89] found id: ""
	I1201 20:20:57.743610  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:20:57.743670  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:57.747702  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:20:57.747776  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:20:57.774194  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:57.774217  199924 cri.go:89] found id: ""
	I1201 20:20:57.774226  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:20:57.774280  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:57.778270  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:20:57.778342  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:20:57.808359  199924 cri.go:89] found id: ""
	I1201 20:20:57.808384  199924 logs.go:282] 0 containers: []
	W1201 20:20:57.808394  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:20:57.808400  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:20:57.808457  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:20:57.838111  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:57.838141  199924 cri.go:89] found id: ""
	I1201 20:20:57.838150  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:20:57.838206  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:57.842104  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:20:57.842200  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:20:57.867623  199924 cri.go:89] found id: ""
	I1201 20:20:57.867647  199924 logs.go:282] 0 containers: []
	W1201 20:20:57.867659  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:20:57.867666  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:20:57.867721  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:20:57.893729  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:57.893750  199924 cri.go:89] found id: ""
	I1201 20:20:57.893758  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:20:57.893814  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:20:57.897902  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:20:57.898014  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:20:57.923020  199924 cri.go:89] found id: ""
	I1201 20:20:57.923043  199924 logs.go:282] 0 containers: []
	W1201 20:20:57.923051  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:20:57.923058  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:20:57.923117  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:20:57.949261  199924 cri.go:89] found id: ""
	I1201 20:20:57.949324  199924 logs.go:282] 0 containers: []
	W1201 20:20:57.949347  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:20:57.949374  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:20:57.949400  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:20:58.007668  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:20:58.007705  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:20:58.089149  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:20:58.089171  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:20:58.089183  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:20:58.124372  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:20:58.124406  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:20:58.157914  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:20:58.157947  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:20:58.195601  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:20:58.195631  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:20:58.229882  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:20:58.229914  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:20:58.260065  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:20:58.260094  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:20:58.273329  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:20:58.273362  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:00.811277  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:00.825994  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:00.826068  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:00.856559  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:00.856584  199924 cri.go:89] found id: ""
	I1201 20:21:00.856592  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:00.856647  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:00.861843  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:00.861917  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:00.893677  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:00.893703  199924 cri.go:89] found id: ""
	I1201 20:21:00.893712  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:00.893772  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:00.897841  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:00.897919  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:00.924164  199924 cri.go:89] found id: ""
	I1201 20:21:00.924190  199924 logs.go:282] 0 containers: []
	W1201 20:21:00.924207  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:00.924215  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:00.924273  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:00.950785  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:00.950807  199924 cri.go:89] found id: ""
	I1201 20:21:00.950815  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:00.950870  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:00.954885  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:00.954958  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:00.981269  199924 cri.go:89] found id: ""
	I1201 20:21:00.981300  199924 logs.go:282] 0 containers: []
	W1201 20:21:00.981310  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:00.981316  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:00.981379  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:01.010149  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:01.010234  199924 cri.go:89] found id: ""
	I1201 20:21:01.010257  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:01.010380  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:01.015899  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:01.016000  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:01.041940  199924 cri.go:89] found id: ""
	I1201 20:21:01.041964  199924 logs.go:282] 0 containers: []
	W1201 20:21:01.041974  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:01.041980  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:01.042038  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:01.067135  199924 cri.go:89] found id: ""
	I1201 20:21:01.067165  199924 logs.go:282] 0 containers: []
	W1201 20:21:01.067174  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:01.067190  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:01.067232  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:01.104579  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:01.104612  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:01.139207  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:01.139241  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:01.179854  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:01.179887  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:01.222068  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:01.222099  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:01.256332  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:01.256368  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:01.289148  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:01.289178  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:01.354621  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:01.354661  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:01.370249  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:01.370280  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:01.449833  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:03.950704  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:03.962783  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:03.962852  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:03.993696  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:03.993716  199924 cri.go:89] found id: ""
	I1201 20:21:03.993724  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:03.993780  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:04.003848  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:04.003921  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:04.052700  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:04.052719  199924 cri.go:89] found id: ""
	I1201 20:21:04.052727  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:04.052782  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:04.057102  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:04.057165  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:04.092540  199924 cri.go:89] found id: ""
	I1201 20:21:04.092562  199924 logs.go:282] 0 containers: []
	W1201 20:21:04.092571  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:04.092577  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:04.092642  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:04.131321  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:04.131340  199924 cri.go:89] found id: ""
	I1201 20:21:04.131348  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:04.131403  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:04.136085  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:04.136155  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:04.174783  199924 cri.go:89] found id: ""
	I1201 20:21:04.174806  199924 logs.go:282] 0 containers: []
	W1201 20:21:04.174815  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:04.174821  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:04.174880  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:04.222844  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:04.222863  199924 cri.go:89] found id: ""
	I1201 20:21:04.222872  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:04.222928  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:04.227099  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:04.227164  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:04.257599  199924 cri.go:89] found id: ""
	I1201 20:21:04.257624  199924 logs.go:282] 0 containers: []
	W1201 20:21:04.257633  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:04.257639  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:04.257699  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:04.298450  199924 cri.go:89] found id: ""
	I1201 20:21:04.298471  199924 logs.go:282] 0 containers: []
	W1201 20:21:04.298480  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:04.298500  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:04.298513  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:04.317964  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:04.317987  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:04.415076  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:04.415100  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:04.415114  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:04.451325  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:04.451356  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:04.484177  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:04.484212  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:04.518582  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:04.518614  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:04.552232  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:04.552268  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:04.616461  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:04.616493  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:04.657958  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:04.657992  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:07.188032  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:07.208664  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:07.208778  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:07.257820  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:07.257904  199924 cri.go:89] found id: ""
	I1201 20:21:07.257928  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:07.258016  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:07.263260  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:07.263351  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:07.309606  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:07.309632  199924 cri.go:89] found id: ""
	I1201 20:21:07.309649  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:07.309708  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:07.315497  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:07.315573  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:07.354114  199924 cri.go:89] found id: ""
	I1201 20:21:07.354141  199924 logs.go:282] 0 containers: []
	W1201 20:21:07.354159  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:07.354172  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:07.354233  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:07.414784  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:07.414809  199924 cri.go:89] found id: ""
	I1201 20:21:07.414817  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:07.414909  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:07.419894  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:07.419973  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:07.461461  199924 cri.go:89] found id: ""
	I1201 20:21:07.461514  199924 logs.go:282] 0 containers: []
	W1201 20:21:07.461547  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:07.461561  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:07.461636  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:07.505023  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:07.505049  199924 cri.go:89] found id: ""
	I1201 20:21:07.505057  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:07.505112  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:07.510108  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:07.510193  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:07.551517  199924 cri.go:89] found id: ""
	I1201 20:21:07.551546  199924 logs.go:282] 0 containers: []
	W1201 20:21:07.551555  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:07.551562  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:07.551624  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:07.592027  199924 cri.go:89] found id: ""
	I1201 20:21:07.592061  199924 logs.go:282] 0 containers: []
	W1201 20:21:07.592070  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:07.592083  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:07.592094  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:07.636447  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:07.636474  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:07.716409  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:07.716504  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:07.811916  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:07.811935  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:07.811948  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:07.865507  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:07.865542  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:07.899616  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:07.899652  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:07.933759  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:07.933794  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:07.947173  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:07.947201  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:07.980304  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:07.980336  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:10.518479  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:10.532737  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:10.532810  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:10.567489  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:10.567512  199924 cri.go:89] found id: ""
	I1201 20:21:10.567522  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:10.567579  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:10.572181  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:10.572262  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:10.605430  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:10.605455  199924 cri.go:89] found id: ""
	I1201 20:21:10.605464  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:10.605535  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:10.610525  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:10.610594  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:10.643020  199924 cri.go:89] found id: ""
	I1201 20:21:10.643047  199924 logs.go:282] 0 containers: []
	W1201 20:21:10.643055  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:10.643061  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:10.643118  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:10.678767  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:10.678791  199924 cri.go:89] found id: ""
	I1201 20:21:10.678800  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:10.678866  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:10.683322  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:10.683393  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:10.716825  199924 cri.go:89] found id: ""
	I1201 20:21:10.716851  199924 logs.go:282] 0 containers: []
	W1201 20:21:10.716860  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:10.716867  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:10.716927  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:10.753344  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:10.753368  199924 cri.go:89] found id: ""
	I1201 20:21:10.753377  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:10.753435  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:10.757800  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:10.757874  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:10.797242  199924 cri.go:89] found id: ""
	I1201 20:21:10.797278  199924 logs.go:282] 0 containers: []
	W1201 20:21:10.797288  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:10.797296  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:10.797359  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:10.832616  199924 cri.go:89] found id: ""
	I1201 20:21:10.832651  199924 logs.go:282] 0 containers: []
	W1201 20:21:10.832662  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:10.832675  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:10.832689  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:10.906979  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:10.907015  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:10.920790  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:10.920819  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:10.966291  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:10.966378  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:11.009016  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:11.009100  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:11.054619  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:11.054697  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:11.193727  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:11.193756  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:11.193770  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:11.246772  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:11.249595  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:11.297274  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:11.297314  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:13.855327  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:13.866181  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:13.866248  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:13.891061  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:13.891084  199924 cri.go:89] found id: ""
	I1201 20:21:13.891092  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:13.891150  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:13.895257  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:13.895331  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:13.922302  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:13.922325  199924 cri.go:89] found id: ""
	I1201 20:21:13.922333  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:13.922386  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:13.926232  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:13.926306  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:13.951248  199924 cri.go:89] found id: ""
	I1201 20:21:13.951270  199924 logs.go:282] 0 containers: []
	W1201 20:21:13.951279  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:13.951285  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:13.951343  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:13.984003  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:13.984079  199924 cri.go:89] found id: ""
	I1201 20:21:13.984101  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:13.984187  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:13.988524  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:13.988599  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:14.019237  199924 cri.go:89] found id: ""
	I1201 20:21:14.019260  199924 logs.go:282] 0 containers: []
	W1201 20:21:14.019268  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:14.019275  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:14.019333  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:14.053432  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:14.053534  199924 cri.go:89] found id: ""
	I1201 20:21:14.053559  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:14.053650  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:14.058225  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:14.058346  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:14.097564  199924 cri.go:89] found id: ""
	I1201 20:21:14.097630  199924 logs.go:282] 0 containers: []
	W1201 20:21:14.097653  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:14.097671  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:14.097752  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:14.144660  199924 cri.go:89] found id: ""
	I1201 20:21:14.144736  199924 logs.go:282] 0 containers: []
	W1201 20:21:14.144758  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:14.144804  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:14.144836  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:14.198432  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:14.198506  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:14.243120  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:14.243144  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:14.316491  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:14.316596  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:14.330910  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:14.330935  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:14.372596  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:14.372662  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:14.408110  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:14.408140  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:14.519997  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:14.520016  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:14.520028  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:14.570430  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:14.570507  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:17.124801  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:17.136119  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:17.136192  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:17.161963  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:17.161987  199924 cri.go:89] found id: ""
	I1201 20:21:17.161996  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:17.162051  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:17.165831  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:17.165904  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:17.191149  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:17.191171  199924 cri.go:89] found id: ""
	I1201 20:21:17.191180  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:17.191236  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:17.195079  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:17.195150  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:17.228494  199924 cri.go:89] found id: ""
	I1201 20:21:17.228518  199924 logs.go:282] 0 containers: []
	W1201 20:21:17.228528  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:17.228535  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:17.228593  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:17.255019  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:17.255043  199924 cri.go:89] found id: ""
	I1201 20:21:17.255051  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:17.255107  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:17.259187  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:17.259260  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:17.285011  199924 cri.go:89] found id: ""
	I1201 20:21:17.285033  199924 logs.go:282] 0 containers: []
	W1201 20:21:17.285042  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:17.285048  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:17.285104  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:17.310325  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:17.310344  199924 cri.go:89] found id: ""
	I1201 20:21:17.310352  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:17.310404  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:17.314309  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:17.314390  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:17.339305  199924 cri.go:89] found id: ""
	I1201 20:21:17.339330  199924 logs.go:282] 0 containers: []
	W1201 20:21:17.339339  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:17.339346  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:17.339438  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:17.371726  199924 cri.go:89] found id: ""
	I1201 20:21:17.371752  199924 logs.go:282] 0 containers: []
	W1201 20:21:17.371761  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:17.371778  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:17.371790  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:17.446924  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:17.446987  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:17.447008  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:17.497576  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:17.497611  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:17.540579  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:17.540611  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:17.570732  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:17.570760  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:17.629940  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:17.629961  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:17.645405  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:17.645433  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:17.680932  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:17.680962  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:17.728212  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:17.728244  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:20.266545  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:20.277623  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:20.277699  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:20.311647  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:20.311673  199924 cri.go:89] found id: ""
	I1201 20:21:20.311687  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:20.311742  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:20.317375  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:20.317445  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:20.354821  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:20.354845  199924 cri.go:89] found id: ""
	I1201 20:21:20.354852  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:20.354912  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:20.359945  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:20.360023  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:20.391107  199924 cri.go:89] found id: ""
	I1201 20:21:20.391133  199924 logs.go:282] 0 containers: []
	W1201 20:21:20.391143  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:20.391149  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:20.391205  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:20.421387  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:20.421410  199924 cri.go:89] found id: ""
	I1201 20:21:20.421418  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:20.421473  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:20.425308  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:20.425381  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:20.455712  199924 cri.go:89] found id: ""
	I1201 20:21:20.455739  199924 logs.go:282] 0 containers: []
	W1201 20:21:20.455747  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:20.455754  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:20.455815  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:20.489876  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:20.489901  199924 cri.go:89] found id: ""
	I1201 20:21:20.489909  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:20.489965  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:20.494226  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:20.494299  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:20.523594  199924 cri.go:89] found id: ""
	I1201 20:21:20.523621  199924 logs.go:282] 0 containers: []
	W1201 20:21:20.523630  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:20.523636  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:20.523693  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:20.555671  199924 cri.go:89] found id: ""
	I1201 20:21:20.555703  199924 logs.go:282] 0 containers: []
	W1201 20:21:20.555714  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:20.555727  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:20.555737  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:20.620676  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:20.620728  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:20.666061  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:20.666134  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:20.721327  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:20.721405  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:20.735693  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:20.735766  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:20.821060  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:20.821092  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:20.821106  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:20.891962  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:20.891996  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:20.955066  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:20.955100  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:21.004538  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:21.004573  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:23.544271  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:23.554854  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:23.554926  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:23.580534  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:23.580558  199924 cri.go:89] found id: ""
	I1201 20:21:23.580566  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:23.580620  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:23.584556  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:23.584633  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:23.610640  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:23.610664  199924 cri.go:89] found id: ""
	I1201 20:21:23.610672  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:23.610730  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:23.614681  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:23.614752  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:23.639875  199924 cri.go:89] found id: ""
	I1201 20:21:23.639897  199924 logs.go:282] 0 containers: []
	W1201 20:21:23.639906  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:23.639912  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:23.639971  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:23.672695  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:23.672718  199924 cri.go:89] found id: ""
	I1201 20:21:23.672726  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:23.672778  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:23.676585  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:23.676656  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:23.703040  199924 cri.go:89] found id: ""
	I1201 20:21:23.703063  199924 logs.go:282] 0 containers: []
	W1201 20:21:23.703071  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:23.703078  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:23.703136  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:23.729842  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:23.729863  199924 cri.go:89] found id: ""
	I1201 20:21:23.729872  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:23.729933  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:23.733854  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:23.733937  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:23.759406  199924 cri.go:89] found id: ""
	I1201 20:21:23.759434  199924 logs.go:282] 0 containers: []
	W1201 20:21:23.759443  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:23.759450  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:23.759507  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:23.785713  199924 cri.go:89] found id: ""
	I1201 20:21:23.785738  199924 logs.go:282] 0 containers: []
	W1201 20:21:23.785747  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:23.785765  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:23.785778  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:23.857250  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:23.857273  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:23.857289  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:23.897354  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:23.897384  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:23.936923  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:23.936954  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:23.978048  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:23.978080  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:24.017110  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:24.017157  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:24.072870  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:24.072906  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:24.105205  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:24.105232  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:24.167267  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:24.167301  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:26.681718  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:26.694326  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:26.694393  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:26.723764  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:26.723790  199924 cri.go:89] found id: ""
	I1201 20:21:26.723799  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:26.723868  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:26.728263  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:26.728332  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:26.763989  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:26.764011  199924 cri.go:89] found id: ""
	I1201 20:21:26.764020  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:26.764090  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:26.768261  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:26.768332  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:26.806222  199924 cri.go:89] found id: ""
	I1201 20:21:26.806247  199924 logs.go:282] 0 containers: []
	W1201 20:21:26.806255  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:26.806262  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:26.806317  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:26.847889  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:26.847912  199924 cri.go:89] found id: ""
	I1201 20:21:26.847921  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:26.847976  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:26.856157  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:26.856227  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:26.917762  199924 cri.go:89] found id: ""
	I1201 20:21:26.917782  199924 logs.go:282] 0 containers: []
	W1201 20:21:26.917791  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:26.917797  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:26.917854  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:26.982659  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:26.982683  199924 cri.go:89] found id: ""
	I1201 20:21:26.982692  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:26.982748  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:26.987616  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:26.987690  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:27.017644  199924 cri.go:89] found id: ""
	I1201 20:21:27.017673  199924 logs.go:282] 0 containers: []
	W1201 20:21:27.017683  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:27.017690  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:27.017750  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:27.050551  199924 cri.go:89] found id: ""
	I1201 20:21:27.050579  199924 logs.go:282] 0 containers: []
	W1201 20:21:27.050588  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:27.050601  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:27.050618  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:27.090165  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:27.090244  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:27.157416  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:27.157435  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:27.157447  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:27.193595  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:27.193628  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:27.232849  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:27.232881  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:27.265511  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:27.265536  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:27.326496  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:27.326532  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:27.339416  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:27.339451  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:27.372749  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:27.372780  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:29.910498  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:29.920982  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:29.921051  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:29.959019  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:29.959039  199924 cri.go:89] found id: ""
	I1201 20:21:29.959047  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:29.959104  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:29.963265  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:29.963349  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:29.989193  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:29.989268  199924 cri.go:89] found id: ""
	I1201 20:21:29.989292  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:29.989413  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:29.993376  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:29.993529  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:30.076108  199924 cri.go:89] found id: ""
	I1201 20:21:30.076138  199924 logs.go:282] 0 containers: []
	W1201 20:21:30.076149  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:30.076157  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:30.076226  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:30.117054  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:30.117081  199924 cri.go:89] found id: ""
	I1201 20:21:30.117091  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:30.117158  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:30.122398  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:30.122505  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:30.155806  199924 cri.go:89] found id: ""
	I1201 20:21:30.155833  199924 logs.go:282] 0 containers: []
	W1201 20:21:30.155842  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:30.155849  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:30.155939  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:30.185264  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:30.185289  199924 cri.go:89] found id: ""
	I1201 20:21:30.185299  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:30.185388  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:30.190216  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:30.190292  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:30.221352  199924 cri.go:89] found id: ""
	I1201 20:21:30.221376  199924 logs.go:282] 0 containers: []
	W1201 20:21:30.221388  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:30.221428  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:30.221541  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:30.247304  199924 cri.go:89] found id: ""
	I1201 20:21:30.247368  199924 logs.go:282] 0 containers: []
	W1201 20:21:30.247391  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:30.247418  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:30.247444  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:30.313079  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:30.313115  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:30.349057  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:30.349091  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:30.383293  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:30.383327  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:30.397180  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:30.397213  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:30.468341  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:30.468402  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:30.468429  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:30.499566  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:30.499599  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:30.533247  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:30.533279  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:30.566990  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:30.567028  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:33.118836  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:33.129826  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:33.129923  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:33.156372  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:33.156395  199924 cri.go:89] found id: ""
	I1201 20:21:33.156402  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:33.156455  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:33.160460  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:33.160530  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:33.186223  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:33.186290  199924 cri.go:89] found id: ""
	I1201 20:21:33.186313  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:33.186393  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:33.190306  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:33.190379  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:33.222943  199924 cri.go:89] found id: ""
	I1201 20:21:33.223017  199924 logs.go:282] 0 containers: []
	W1201 20:21:33.223033  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:33.223040  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:33.223114  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:33.248452  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:33.248475  199924 cri.go:89] found id: ""
	I1201 20:21:33.248483  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:33.248571  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:33.252861  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:33.252955  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:33.280296  199924 cri.go:89] found id: ""
	I1201 20:21:33.280320  199924 logs.go:282] 0 containers: []
	W1201 20:21:33.280329  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:33.280354  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:33.280435  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:33.315555  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:33.315579  199924 cri.go:89] found id: ""
	I1201 20:21:33.315590  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:33.315667  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:33.319965  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:33.320038  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:33.345320  199924 cri.go:89] found id: ""
	I1201 20:21:33.345345  199924 logs.go:282] 0 containers: []
	W1201 20:21:33.345353  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:33.345359  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:33.345416  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:33.370448  199924 cri.go:89] found id: ""
	I1201 20:21:33.370485  199924 logs.go:282] 0 containers: []
	W1201 20:21:33.370494  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:33.370509  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:33.370520  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:33.428360  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:33.428394  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:33.441854  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:33.441883  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:33.515515  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:33.515536  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:33.515548  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:33.551812  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:33.551841  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:33.586055  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:33.586128  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:33.623538  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:33.623616  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:33.672640  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:33.672719  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:33.707612  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:33.707643  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:36.242908  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:36.253642  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:36.253708  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:36.288029  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:36.288064  199924 cri.go:89] found id: ""
	I1201 20:21:36.288072  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:36.288135  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:36.292623  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:36.292717  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:36.324190  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:36.324211  199924 cri.go:89] found id: ""
	I1201 20:21:36.324219  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:36.324276  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:36.328145  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:36.328263  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:36.353930  199924 cri.go:89] found id: ""
	I1201 20:21:36.353960  199924 logs.go:282] 0 containers: []
	W1201 20:21:36.353968  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:36.353976  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:36.354034  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:36.382821  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:36.382845  199924 cri.go:89] found id: ""
	I1201 20:21:36.382854  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:36.382916  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:36.387037  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:36.387114  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:36.416001  199924 cri.go:89] found id: ""
	I1201 20:21:36.416026  199924 logs.go:282] 0 containers: []
	W1201 20:21:36.416035  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:36.416041  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:36.416098  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:36.441723  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:36.441746  199924 cri.go:89] found id: ""
	I1201 20:21:36.441755  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:36.441811  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:36.445814  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:36.445886  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:36.471229  199924 cri.go:89] found id: ""
	I1201 20:21:36.471254  199924 logs.go:282] 0 containers: []
	W1201 20:21:36.471264  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:36.471271  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:36.471348  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:36.498002  199924 cri.go:89] found id: ""
	I1201 20:21:36.498028  199924 logs.go:282] 0 containers: []
	W1201 20:21:36.498037  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:36.498054  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:36.498068  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:36.531613  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:36.531643  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:36.566887  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:36.566918  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:36.599212  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:36.599287  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:36.640355  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:36.640431  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:36.712694  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:36.712716  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:36.712729  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:36.743700  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:36.743730  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:36.802636  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:36.802714  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:36.815682  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:36.815761  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:39.349124  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:39.359934  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:39.360000  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:39.386480  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:39.386503  199924 cri.go:89] found id: ""
	I1201 20:21:39.386511  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:39.386565  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:39.390467  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:39.390540  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:39.415495  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:39.415516  199924 cri.go:89] found id: ""
	I1201 20:21:39.415525  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:39.415639  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:39.419893  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:39.419962  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:39.448127  199924 cri.go:89] found id: ""
	I1201 20:21:39.448153  199924 logs.go:282] 0 containers: []
	W1201 20:21:39.448164  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:39.448175  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:39.448246  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:39.476350  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:39.476371  199924 cri.go:89] found id: ""
	I1201 20:21:39.476379  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:39.476431  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:39.480223  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:39.480296  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:39.506916  199924 cri.go:89] found id: ""
	I1201 20:21:39.506942  199924 logs.go:282] 0 containers: []
	W1201 20:21:39.506952  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:39.506961  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:39.507038  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:39.532566  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:39.532588  199924 cri.go:89] found id: ""
	I1201 20:21:39.532597  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:39.532654  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:39.536634  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:39.536707  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:39.566156  199924 cri.go:89] found id: ""
	I1201 20:21:39.566189  199924 logs.go:282] 0 containers: []
	W1201 20:21:39.566199  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:39.566207  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:39.566272  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:39.593924  199924 cri.go:89] found id: ""
	I1201 20:21:39.593950  199924 logs.go:282] 0 containers: []
	W1201 20:21:39.593959  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:39.593973  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:39.593984  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:39.608371  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:39.608400  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:39.679748  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:39.679812  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:39.679834  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:39.715167  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:39.715201  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:39.775659  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:39.775697  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:39.810542  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:39.810573  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:39.843071  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:39.843103  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:39.886616  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:39.886645  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:39.919489  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:39.919521  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:42.456388  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:42.469000  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:42.469081  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:42.500015  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:42.500038  199924 cri.go:89] found id: ""
	I1201 20:21:42.500047  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:42.500102  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:42.504209  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:42.504281  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:42.530759  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:42.530783  199924 cri.go:89] found id: ""
	I1201 20:21:42.530791  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:42.530847  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:42.534807  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:42.534881  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:42.560989  199924 cri.go:89] found id: ""
	I1201 20:21:42.561012  199924 logs.go:282] 0 containers: []
	W1201 20:21:42.561021  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:42.561027  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:42.561086  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:42.593764  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:42.593787  199924 cri.go:89] found id: ""
	I1201 20:21:42.593795  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:42.593847  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:42.599086  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:42.599168  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:42.634554  199924 cri.go:89] found id: ""
	I1201 20:21:42.634579  199924 logs.go:282] 0 containers: []
	W1201 20:21:42.634588  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:42.634594  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:42.634650  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:42.661719  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:42.661752  199924 cri.go:89] found id: ""
	I1201 20:21:42.661761  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:42.661828  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:42.665560  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:42.665632  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:42.691394  199924 cri.go:89] found id: ""
	I1201 20:21:42.691433  199924 logs.go:282] 0 containers: []
	W1201 20:21:42.691442  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:42.691449  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:42.691545  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:42.719619  199924 cri.go:89] found id: ""
	I1201 20:21:42.719645  199924 logs.go:282] 0 containers: []
	W1201 20:21:42.719653  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:42.719667  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:42.719678  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:42.777358  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:42.777393  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:42.846350  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:42.846376  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:42.846388  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:42.879631  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:42.879661  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:42.920094  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:42.920123  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:42.955942  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:42.955969  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:42.968491  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:42.968519  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:43.003420  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:43.003453  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:43.040351  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:43.040382  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:45.574681  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:45.588203  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:45.588278  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:45.647905  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:45.647931  199924 cri.go:89] found id: ""
	I1201 20:21:45.647939  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:45.647999  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:45.660606  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:45.660690  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:45.707880  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:45.707904  199924 cri.go:89] found id: ""
	I1201 20:21:45.707912  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:45.707966  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:45.712689  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:45.712767  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:45.747832  199924 cri.go:89] found id: ""
	I1201 20:21:45.747860  199924 logs.go:282] 0 containers: []
	W1201 20:21:45.747869  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:45.747876  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:45.747932  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:45.780011  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:45.780036  199924 cri.go:89] found id: ""
	I1201 20:21:45.780044  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:45.780111  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:45.787536  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:45.787650  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:45.825995  199924 cri.go:89] found id: ""
	I1201 20:21:45.826022  199924 logs.go:282] 0 containers: []
	W1201 20:21:45.826031  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:45.826038  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:45.826093  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:45.859298  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:45.859323  199924 cri.go:89] found id: ""
	I1201 20:21:45.859331  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:45.859385  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:45.863637  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:45.863707  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:45.911520  199924 cri.go:89] found id: ""
	I1201 20:21:45.911547  199924 logs.go:282] 0 containers: []
	W1201 20:21:45.911555  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:45.911562  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:45.911620  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:45.952735  199924 cri.go:89] found id: ""
	I1201 20:21:45.952762  199924 logs.go:282] 0 containers: []
	W1201 20:21:45.952771  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:45.952785  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:45.952801  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:46.022506  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:46.022543  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:46.039524  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:46.039555  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:46.104932  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:46.105010  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:46.151305  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:46.151339  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:46.186124  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:46.186157  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:46.223973  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:46.223998  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:46.293064  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:46.293087  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:46.293102  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:46.330711  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:46.330744  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:48.902697  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:48.916984  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:48.917053  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:48.952514  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:48.952536  199924 cri.go:89] found id: ""
	I1201 20:21:48.952544  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:48.952599  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:48.956570  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:48.956642  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:49.005805  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:49.005828  199924 cri.go:89] found id: ""
	I1201 20:21:49.005837  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:49.005892  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:49.010635  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:49.010729  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:49.043829  199924 cri.go:89] found id: ""
	I1201 20:21:49.043857  199924 logs.go:282] 0 containers: []
	W1201 20:21:49.043867  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:49.043874  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:49.043942  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:49.094203  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:49.094247  199924 cri.go:89] found id: ""
	I1201 20:21:49.094256  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:49.094319  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:49.099443  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:49.099523  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:49.142802  199924 cri.go:89] found id: ""
	I1201 20:21:49.142828  199924 logs.go:282] 0 containers: []
	W1201 20:21:49.142836  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:49.142843  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:49.142900  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:49.179441  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:49.179478  199924 cri.go:89] found id: ""
	I1201 20:21:49.179487  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:49.179558  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:49.189295  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:49.189385  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:49.235936  199924 cri.go:89] found id: ""
	I1201 20:21:49.235960  199924 logs.go:282] 0 containers: []
	W1201 20:21:49.235976  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:49.235985  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:49.236081  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:49.291977  199924 cri.go:89] found id: ""
	I1201 20:21:49.292004  199924 logs.go:282] 0 containers: []
	W1201 20:21:49.292013  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:49.292030  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:49.292041  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:49.377108  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:49.377147  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:49.456518  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:49.456547  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:49.504405  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:49.504436  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:49.549649  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:49.549682  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:49.598948  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:49.599024  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:49.656506  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:49.656535  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:49.670053  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:49.670077  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:49.770824  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:49.770842  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:49.770855  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:52.305624  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:52.316940  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:52.317008  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:52.359826  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:52.359849  199924 cri.go:89] found id: ""
	I1201 20:21:52.359857  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:52.359914  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:52.364098  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:52.364174  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:52.399403  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:52.399429  199924 cri.go:89] found id: ""
	I1201 20:21:52.399438  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:52.399493  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:52.403915  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:52.403983  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:52.442343  199924 cri.go:89] found id: ""
	I1201 20:21:52.442363  199924 logs.go:282] 0 containers: []
	W1201 20:21:52.442371  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:52.442377  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:52.442436  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:52.475099  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:52.475125  199924 cri.go:89] found id: ""
	I1201 20:21:52.475134  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:52.475192  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:52.480043  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:52.480114  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:52.512159  199924 cri.go:89] found id: ""
	I1201 20:21:52.512192  199924 logs.go:282] 0 containers: []
	W1201 20:21:52.512203  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:52.512210  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:52.512275  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:52.542939  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:52.542963  199924 cri.go:89] found id: ""
	I1201 20:21:52.542972  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:52.543025  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:52.547556  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:52.547631  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:52.592555  199924 cri.go:89] found id: ""
	I1201 20:21:52.592578  199924 logs.go:282] 0 containers: []
	W1201 20:21:52.592586  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:52.592592  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:52.592649  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:52.679829  199924 cri.go:89] found id: ""
	I1201 20:21:52.679855  199924 logs.go:282] 0 containers: []
	W1201 20:21:52.679864  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:52.679880  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:52.679891  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:52.752830  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:52.752869  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:52.845150  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:52.845171  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:52.845187  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:52.886590  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:52.886625  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:52.934302  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:52.934333  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:52.979192  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:52.979225  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:53.023503  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:53.023539  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:53.040023  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:53.040051  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:53.075143  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:53.075177  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:55.609632  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:55.620976  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:55.621047  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:55.654491  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:55.654517  199924 cri.go:89] found id: ""
	I1201 20:21:55.654526  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:55.654582  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:55.659057  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:55.659127  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:55.684867  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:55.684892  199924 cri.go:89] found id: ""
	I1201 20:21:55.684900  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:55.684956  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:55.688903  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:55.688978  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:55.714038  199924 cri.go:89] found id: ""
	I1201 20:21:55.714062  199924 logs.go:282] 0 containers: []
	W1201 20:21:55.714070  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:55.714076  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:55.714136  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:55.739957  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:55.739980  199924 cri.go:89] found id: ""
	I1201 20:21:55.739988  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:55.740043  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:55.743837  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:55.743912  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:55.768928  199924 cri.go:89] found id: ""
	I1201 20:21:55.768953  199924 logs.go:282] 0 containers: []
	W1201 20:21:55.768963  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:55.768970  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:55.769027  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:55.794691  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:55.794719  199924 cri.go:89] found id: ""
	I1201 20:21:55.794729  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:55.794781  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:55.798507  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:55.798597  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:55.823333  199924 cri.go:89] found id: ""
	I1201 20:21:55.823358  199924 logs.go:282] 0 containers: []
	W1201 20:21:55.823367  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:55.823374  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:55.823428  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:55.852600  199924 cri.go:89] found id: ""
	I1201 20:21:55.852623  199924 logs.go:282] 0 containers: []
	W1201 20:21:55.852632  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:55.852646  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:55.852657  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:55.910052  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:55.910084  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:55.923187  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:55.923256  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:55.995621  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:55.995684  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:55.995710  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:56.032908  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:56.032979  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:56.069409  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:56.069663  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:56.110673  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:56.110766  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:56.152332  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:56.152402  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:56.205588  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:56.205661  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:58.753036  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:21:58.764036  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:21:58.764108  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:21:58.789093  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:58.789117  199924 cri.go:89] found id: ""
	I1201 20:21:58.789126  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:21:58.789179  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:58.792985  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:21:58.793056  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:21:58.818234  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:58.818259  199924 cri.go:89] found id: ""
	I1201 20:21:58.818268  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:21:58.818325  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:58.822075  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:21:58.822148  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:21:58.847886  199924 cri.go:89] found id: ""
	I1201 20:21:58.847911  199924 logs.go:282] 0 containers: []
	W1201 20:21:58.847920  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:21:58.847927  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:21:58.847983  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:21:58.873839  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:58.873868  199924 cri.go:89] found id: ""
	I1201 20:21:58.873876  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:21:58.873930  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:58.877941  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:21:58.878042  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:21:58.905660  199924 cri.go:89] found id: ""
	I1201 20:21:58.905684  199924 logs.go:282] 0 containers: []
	W1201 20:21:58.905693  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:21:58.905700  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:21:58.905756  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:21:58.931054  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:21:58.931074  199924 cri.go:89] found id: ""
	I1201 20:21:58.931082  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:21:58.931135  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:21:58.934840  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:21:58.934909  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:21:58.959631  199924 cri.go:89] found id: ""
	I1201 20:21:58.959655  199924 logs.go:282] 0 containers: []
	W1201 20:21:58.959664  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:21:58.959671  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:21:58.959732  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:21:58.983172  199924 cri.go:89] found id: ""
	I1201 20:21:58.983196  199924 logs.go:282] 0 containers: []
	W1201 20:21:58.983205  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:21:58.983221  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:21:58.983234  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:21:59.052236  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:21:59.052256  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:21:59.052269  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:21:59.083085  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:21:59.083117  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:21:59.116735  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:21:59.116769  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:21:59.145721  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:21:59.145750  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:21:59.203152  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:21:59.203186  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:21:59.225379  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:21:59.225407  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:21:59.259436  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:21:59.259469  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:21:59.292597  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:21:59.292632  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:22:01.826573  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:22:01.837355  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:22:01.837425  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:22:01.866856  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:22:01.866880  199924 cri.go:89] found id: ""
	I1201 20:22:01.866889  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:22:01.866944  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:01.870929  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:22:01.871003  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:22:01.897329  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:22:01.897354  199924 cri.go:89] found id: ""
	I1201 20:22:01.897362  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:22:01.897419  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:01.902357  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:22:01.902431  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:22:01.929379  199924 cri.go:89] found id: ""
	I1201 20:22:01.929408  199924 logs.go:282] 0 containers: []
	W1201 20:22:01.929418  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:22:01.929436  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:22:01.929546  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:22:01.956034  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:22:01.956065  199924 cri.go:89] found id: ""
	I1201 20:22:01.956073  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:22:01.956166  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:01.960231  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:22:01.960305  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:22:01.987616  199924 cri.go:89] found id: ""
	I1201 20:22:01.987640  199924 logs.go:282] 0 containers: []
	W1201 20:22:01.987648  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:22:01.987655  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:22:01.987742  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:22:02.015154  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:22:02.015190  199924 cri.go:89] found id: ""
	I1201 20:22:02.015200  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:22:02.015305  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:02.020581  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:22:02.020660  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:22:02.048776  199924 cri.go:89] found id: ""
	I1201 20:22:02.048798  199924 logs.go:282] 0 containers: []
	W1201 20:22:02.048808  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:22:02.048814  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:22:02.048871  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:22:02.087445  199924 cri.go:89] found id: ""
	I1201 20:22:02.087485  199924 logs.go:282] 0 containers: []
	W1201 20:22:02.087495  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:22:02.087522  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:22:02.087546  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:22:02.153028  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:22:02.153077  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:22:02.234533  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:22:02.234553  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:22:02.234568  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:22:02.247811  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:22:02.247839  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:22:02.294210  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:22:02.294281  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:22:02.331730  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:22:02.331764  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:22:02.368751  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:22:02.368784  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:22:02.417277  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:22:02.417309  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:22:02.452001  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:22:02.452038  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:22:04.981905  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:22:04.992751  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:22:04.992822  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:22:05.028732  199924 cri.go:89] found id: "95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:22:05.028757  199924 cri.go:89] found id: ""
	I1201 20:22:05.028767  199924 logs.go:282] 1 containers: [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a]
	I1201 20:22:05.028829  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:05.033211  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:22:05.033285  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:22:05.062371  199924 cri.go:89] found id: "2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:22:05.062394  199924 cri.go:89] found id: ""
	I1201 20:22:05.062401  199924 logs.go:282] 1 containers: [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52]
	I1201 20:22:05.062456  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:05.066459  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:22:05.066529  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:22:05.092455  199924 cri.go:89] found id: ""
	I1201 20:22:05.092479  199924 logs.go:282] 0 containers: []
	W1201 20:22:05.092487  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:22:05.092493  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:22:05.092549  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:22:05.124490  199924 cri.go:89] found id: "f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:22:05.124574  199924 cri.go:89] found id: ""
	I1201 20:22:05.124598  199924 logs.go:282] 1 containers: [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26]
	I1201 20:22:05.124683  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:05.128785  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:22:05.128881  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:22:05.154627  199924 cri.go:89] found id: ""
	I1201 20:22:05.154654  199924 logs.go:282] 0 containers: []
	W1201 20:22:05.154663  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:22:05.154670  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:22:05.154748  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:22:05.181341  199924 cri.go:89] found id: "f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:22:05.181366  199924 cri.go:89] found id: ""
	I1201 20:22:05.181391  199924 logs.go:282] 1 containers: [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129]
	I1201 20:22:05.181446  199924 ssh_runner.go:195] Run: which crictl
	I1201 20:22:05.185448  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:22:05.185543  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:22:05.212784  199924 cri.go:89] found id: ""
	I1201 20:22:05.212811  199924 logs.go:282] 0 containers: []
	W1201 20:22:05.212820  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:22:05.212826  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:22:05.212888  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:22:05.239096  199924 cri.go:89] found id: ""
	I1201 20:22:05.239119  199924 logs.go:282] 0 containers: []
	W1201 20:22:05.239128  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:22:05.239143  199924 logs.go:123] Gathering logs for etcd [2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52] ...
	I1201 20:22:05.239155  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52"
	I1201 20:22:05.272682  199924 logs.go:123] Gathering logs for kube-scheduler [f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26] ...
	I1201 20:22:05.272714  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26"
	I1201 20:22:05.318329  199924 logs.go:123] Gathering logs for kube-controller-manager [f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129] ...
	I1201 20:22:05.318363  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129"
	I1201 20:22:05.359800  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:22:05.359836  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1201 20:22:05.410002  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:22:05.410034  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:22:05.477314  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:22:05.477350  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:22:05.512825  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:22:05.512857  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:22:05.526457  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:22:05.526491  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:22:05.596567  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:22:05.596631  199924 logs.go:123] Gathering logs for kube-apiserver [95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a] ...
	I1201 20:22:05.596651  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a"
	I1201 20:22:08.131328  199924 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:22:08.143441  199924 kubeadm.go:602] duration metric: took 4m4.529289193s to restartPrimaryControlPlane
	W1201 20:22:08.143523  199924 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1201 20:22:08.143621  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 20:22:08.633018  199924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 20:22:08.647190  199924 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 20:22:08.655747  199924 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 20:22:08.655820  199924 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 20:22:08.664624  199924 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 20:22:08.664648  199924 kubeadm.go:158] found existing configuration files:
	
	I1201 20:22:08.664702  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1201 20:22:08.673629  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 20:22:08.673720  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 20:22:08.682663  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1201 20:22:08.691160  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 20:22:08.691247  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 20:22:08.699239  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1201 20:22:08.707681  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 20:22:08.707775  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 20:22:08.715774  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1201 20:22:08.724502  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 20:22:08.724590  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 20:22:08.732945  199924 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 20:22:08.771480  199924 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 20:22:08.771792  199924 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 20:22:08.846722  199924 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 20:22:08.846799  199924 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 20:22:08.846843  199924 kubeadm.go:319] OS: Linux
	I1201 20:22:08.846894  199924 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 20:22:08.846946  199924 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 20:22:08.846997  199924 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 20:22:08.847049  199924 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 20:22:08.847100  199924 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 20:22:08.847152  199924 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 20:22:08.847202  199924 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 20:22:08.847254  199924 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 20:22:08.847305  199924 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 20:22:08.910006  199924 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 20:22:08.910122  199924 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 20:22:08.910227  199924 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 20:22:08.921986  199924 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 20:22:08.927052  199924 out.go:252]   - Generating certificates and keys ...
	I1201 20:22:08.927165  199924 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 20:22:08.927268  199924 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 20:22:08.927380  199924 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 20:22:08.927482  199924 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 20:22:08.927623  199924 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 20:22:08.927703  199924 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 20:22:08.927819  199924 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 20:22:08.927912  199924 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 20:22:08.928037  199924 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 20:22:08.928141  199924 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 20:22:08.928187  199924 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 20:22:08.928250  199924 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 20:22:09.311563  199924 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 20:22:09.900695  199924 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 20:22:10.000253  199924 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 20:22:10.531608  199924 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 20:22:10.718299  199924 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 20:22:10.721211  199924 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 20:22:10.725980  199924 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 20:22:10.729381  199924 out.go:252]   - Booting up control plane ...
	I1201 20:22:10.729601  199924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 20:22:10.729715  199924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 20:22:10.730796  199924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 20:22:10.760730  199924 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 20:22:10.760850  199924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 20:22:10.770837  199924 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 20:22:10.770939  199924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 20:22:10.770979  199924 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 20:22:10.946682  199924 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 20:22:10.946802  199924 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 20:26:10.946847  199924 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.00020429s
	I1201 20:26:10.946879  199924 kubeadm.go:319] 
	I1201 20:26:10.946934  199924 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 20:26:10.946965  199924 kubeadm.go:319] 	- The kubelet is not running
	I1201 20:26:10.947064  199924 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 20:26:10.947069  199924 kubeadm.go:319] 
	I1201 20:26:10.947167  199924 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 20:26:10.947197  199924 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 20:26:10.947226  199924 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 20:26:10.947230  199924 kubeadm.go:319] 
	I1201 20:26:10.951123  199924 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 20:26:10.951583  199924 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 20:26:10.951740  199924 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 20:26:10.952017  199924 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1201 20:26:10.952025  199924 kubeadm.go:319] 
	I1201 20:26:10.952139  199924 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1201 20:26:10.952227  199924 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.00020429s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.00020429s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1201 20:26:10.952308  199924 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1201 20:26:11.362963  199924 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 20:26:11.377248  199924 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 20:26:11.377317  199924 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 20:26:11.385864  199924 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 20:26:11.385922  199924 kubeadm.go:158] found existing configuration files:
	
	I1201 20:26:11.385986  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1201 20:26:11.394131  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 20:26:11.394200  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 20:26:11.402031  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1201 20:26:11.410138  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 20:26:11.410249  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 20:26:11.418140  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1201 20:26:11.426364  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 20:26:11.426437  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 20:26:11.434186  199924 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1201 20:26:11.442749  199924 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 20:26:11.442818  199924 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 20:26:11.450989  199924 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 20:26:11.492253  199924 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1201 20:26:11.492323  199924 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 20:26:11.583738  199924 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 20:26:11.583904  199924 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 20:26:11.583987  199924 kubeadm.go:319] OS: Linux
	I1201 20:26:11.584068  199924 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 20:26:11.584154  199924 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 20:26:11.584237  199924 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 20:26:11.584327  199924 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 20:26:11.584408  199924 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 20:26:11.584494  199924 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 20:26:11.584574  199924 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 20:26:11.584657  199924 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 20:26:11.584743  199924 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 20:26:11.656583  199924 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 20:26:11.656699  199924 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 20:26:11.656796  199924 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 20:26:11.669942  199924 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 20:26:11.675812  199924 out.go:252]   - Generating certificates and keys ...
	I1201 20:26:11.675999  199924 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 20:26:11.676107  199924 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 20:26:11.676256  199924 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1201 20:26:11.676345  199924 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1201 20:26:11.676452  199924 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1201 20:26:11.676525  199924 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1201 20:26:11.676677  199924 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1201 20:26:11.676771  199924 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1201 20:26:11.676915  199924 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1201 20:26:11.677005  199924 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1201 20:26:11.677050  199924 kubeadm.go:319] [certs] Using the existing "sa" key
	I1201 20:26:11.677116  199924 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 20:26:11.908366  199924 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 20:26:12.179448  199924 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 20:26:12.697405  199924 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 20:26:12.909882  199924 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 20:26:13.148867  199924 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 20:26:13.150877  199924 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 20:26:13.153744  199924 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 20:26:13.157119  199924 out.go:252]   - Booting up control plane ...
	I1201 20:26:13.157238  199924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 20:26:13.157322  199924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 20:26:13.157813  199924 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 20:26:13.182079  199924 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 20:26:13.182258  199924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 20:26:13.190195  199924 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 20:26:13.190615  199924 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 20:26:13.190887  199924 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 20:26:13.329777  199924 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 20:26:13.329982  199924 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 20:30:13.324007  199924 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000714482s
	I1201 20:30:13.324103  199924 kubeadm.go:319] 
	I1201 20:30:13.324197  199924 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 20:30:13.324268  199924 kubeadm.go:319] 	- The kubelet is not running
	I1201 20:30:13.324416  199924 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 20:30:13.324455  199924 kubeadm.go:319] 
	I1201 20:30:13.324597  199924 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 20:30:13.324655  199924 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 20:30:13.324713  199924 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 20:30:13.324738  199924 kubeadm.go:319] 
	I1201 20:30:13.328109  199924 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 20:30:13.328573  199924 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 20:30:13.328702  199924 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 20:30:13.328942  199924 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1201 20:30:13.328948  199924 kubeadm.go:319] 
	I1201 20:30:13.329017  199924 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1201 20:30:13.329075  199924 kubeadm.go:403] duration metric: took 12m9.776495224s to StartCluster
	I1201 20:30:13.329110  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:30:13.329170  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:30:13.354820  199924 cri.go:89] found id: ""
	I1201 20:30:13.354843  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.354852  199924 logs.go:284] No container was found matching "kube-apiserver"
	I1201 20:30:13.354859  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:30:13.354920  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:30:13.387756  199924 cri.go:89] found id: ""
	I1201 20:30:13.387835  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.387857  199924 logs.go:284] No container was found matching "etcd"
	I1201 20:30:13.387875  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:30:13.387957  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:30:13.413213  199924 cri.go:89] found id: ""
	I1201 20:30:13.413235  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.413243  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:30:13.413250  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:30:13.413310  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:30:13.452434  199924 cri.go:89] found id: ""
	I1201 20:30:13.452499  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.452531  199924 logs.go:284] No container was found matching "kube-scheduler"
	I1201 20:30:13.452553  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:30:13.452664  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:30:13.479043  199924 cri.go:89] found id: ""
	I1201 20:30:13.479068  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.479078  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:30:13.479085  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:30:13.479145  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:30:13.504793  199924 cri.go:89] found id: ""
	I1201 20:30:13.504828  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.504837  199924 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 20:30:13.504861  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:30:13.504943  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:30:13.532075  199924 cri.go:89] found id: ""
	I1201 20:30:13.532107  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.532116  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:30:13.532139  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:30:13.532219  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:30:13.558878  199924 cri.go:89] found id: ""
	I1201 20:30:13.558945  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.558967  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:30:13.558991  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:30:13.559009  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:30:13.618214  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:30:13.618256  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:30:13.636942  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:30:13.637021  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:30:13.713742  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:30:13.713810  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:30:13.713849  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:30:13.752642  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:30:13.752676  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1201 20:30:13.786390  199924 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000714482s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1201 20:30:13.786444  199924 out.go:285] * 
	* 
	W1201 20:30:13.786528  199924 out.go:285] * 
	W1201 20:30:13.788983  199924 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 20:30:13.794629  199924 out.go:203] 
	W1201 20:30:13.797610  199924 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000714482s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 20:30:13.797667  199924 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1201 20:30:13.797690  199924 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1201 20:30:13.801069  199924 out.go:203] 

                                                
                                                
** /stderr **
version_upgrade_test.go:245: failed to upgrade with newest k8s version. args: out/minikube-linux-arm64 start -p kubernetes-upgrade-846544 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd : exit status 109
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-846544 version --output=json
version_upgrade_test.go:248: (dbg) Non-zero exit: kubectl --context kubernetes-upgrade-846544 version --output=json: exit status 1 (91.072847ms)

                                                
                                                
-- stdout --
	{
	  "clientVersion": {
	    "major": "1",
	    "minor": "33",
	    "gitVersion": "v1.33.2",
	    "gitCommit": "a57b6f7709f6c2722b92f07b8b4c48210a51fc40",
	    "gitTreeState": "clean",
	    "buildDate": "2025-06-17T18:41:31Z",
	    "goVersion": "go1.24.4",
	    "compiler": "gc",
	    "platform": "linux/arm64"
	  },
	  "kustomizeVersion": "v5.6.0"
	}

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.76.2:8443 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
version_upgrade_test.go:250: error running kubectl: exit status 1
panic.go:615: *** TestKubernetesUpgrade FAILED at 2025-12-01 20:30:14.573451737 +0000 UTC m=+5098.474788887
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestKubernetesUpgrade]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestKubernetesUpgrade]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect kubernetes-upgrade-846544
helpers_test.go:243: (dbg) docker inspect kubernetes-upgrade-846544:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "adef4b32c361690f724711c222eebfb7af4b398d206448f52e2067046aea2a36",
	        "Created": "2025-12-01T20:17:10.797091352Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 200055,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-01T20:17:43.627897271Z",
	            "FinishedAt": "2025-12-01T20:17:42.484873887Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/adef4b32c361690f724711c222eebfb7af4b398d206448f52e2067046aea2a36/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/adef4b32c361690f724711c222eebfb7af4b398d206448f52e2067046aea2a36/hostname",
	        "HostsPath": "/var/lib/docker/containers/adef4b32c361690f724711c222eebfb7af4b398d206448f52e2067046aea2a36/hosts",
	        "LogPath": "/var/lib/docker/containers/adef4b32c361690f724711c222eebfb7af4b398d206448f52e2067046aea2a36/adef4b32c361690f724711c222eebfb7af4b398d206448f52e2067046aea2a36-json.log",
	        "Name": "/kubernetes-upgrade-846544",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "kubernetes-upgrade-846544:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "kubernetes-upgrade-846544",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "adef4b32c361690f724711c222eebfb7af4b398d206448f52e2067046aea2a36",
	                "LowerDir": "/var/lib/docker/overlay2/5f5c28866c68682556f5545959a721828ac95d8e6b809ff4dd0c0c1e0785d562-init/diff:/var/lib/docker/overlay2/d615a1a7c8a8c16c226473407fa1a9f3f15588b2e938958b41966d29d830ad8b/diff",
	                "MergedDir": "/var/lib/docker/overlay2/5f5c28866c68682556f5545959a721828ac95d8e6b809ff4dd0c0c1e0785d562/merged",
	                "UpperDir": "/var/lib/docker/overlay2/5f5c28866c68682556f5545959a721828ac95d8e6b809ff4dd0c0c1e0785d562/diff",
	                "WorkDir": "/var/lib/docker/overlay2/5f5c28866c68682556f5545959a721828ac95d8e6b809ff4dd0c0c1e0785d562/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "kubernetes-upgrade-846544",
	                "Source": "/var/lib/docker/volumes/kubernetes-upgrade-846544/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "kubernetes-upgrade-846544",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "kubernetes-upgrade-846544",
	                "name.minikube.sigs.k8s.io": "kubernetes-upgrade-846544",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "4a0899ff4947f441e188f75aa9b1a540adad5a0b7479f02dab0f87a2377ff868",
	            "SandboxKey": "/var/run/docker/netns/4a0899ff4947",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33008"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33009"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33012"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33010"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33011"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "kubernetes-upgrade-846544": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.76.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "f6:ea:b6:62:b2:c1",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f87e76bd53999367c30510803dc85cb28093abd8cd59cd3b89812fd82bc8463a",
	                    "EndpointID": "a3a8b3b27530af9fd2b3bc634aa2271aaacff4282443b49ae684f4a64c41243c",
	                    "Gateway": "192.168.76.1",
	                    "IPAddress": "192.168.76.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "kubernetes-upgrade-846544",
	                        "adef4b32c361"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-846544 -n kubernetes-upgrade-846544
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-846544 -n kubernetes-upgrade-846544: exit status 2 (317.874591ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestKubernetesUpgrade FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestKubernetesUpgrade]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-846544 logs -n 25
helpers_test.go:260: TestKubernetesUpgrade logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬─────────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                         ARGS                                                                          │           PROFILE           │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼─────────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ delete  │ -p insufficient-storage-895439                                                                                                                        │ insufficient-storage-895439 │ jenkins │ v1.37.0 │ 01 Dec 25 20:15 UTC │ 01 Dec 25 20:15 UTC │
	│ start   │ -p NoKubernetes-178134 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd                                   │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:15 UTC │                     │
	│ start   │ -p NoKubernetes-178134 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd                                           │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:15 UTC │ 01 Dec 25 20:16 UTC │
	│ start   │ -p missing-upgrade-847129 --memory=3072 --driver=docker  --container-runtime=containerd                                                               │ missing-upgrade-847129      │ jenkins │ v1.35.0 │ 01 Dec 25 20:16 UTC │ 01 Dec 25 20:17 UTC │
	│ start   │ -p NoKubernetes-178134 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd                           │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:16 UTC │ 01 Dec 25 20:16 UTC │
	│ delete  │ -p NoKubernetes-178134                                                                                                                                │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:16 UTC │ 01 Dec 25 20:16 UTC │
	│ start   │ -p NoKubernetes-178134 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd                           │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:16 UTC │ 01 Dec 25 20:16 UTC │
	│ ssh     │ -p NoKubernetes-178134 sudo systemctl is-active --quiet service kubelet                                                                               │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:16 UTC │                     │
	│ stop    │ -p NoKubernetes-178134                                                                                                                                │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:16 UTC │ 01 Dec 25 20:16 UTC │
	│ start   │ -p NoKubernetes-178134 --driver=docker  --container-runtime=containerd                                                                                │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:16 UTC │ 01 Dec 25 20:17 UTC │
	│ ssh     │ -p NoKubernetes-178134 sudo systemctl is-active --quiet service kubelet                                                                               │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:17 UTC │                     │
	│ delete  │ -p NoKubernetes-178134                                                                                                                                │ NoKubernetes-178134         │ jenkins │ v1.37.0 │ 01 Dec 25 20:17 UTC │ 01 Dec 25 20:17 UTC │
	│ start   │ -p kubernetes-upgrade-846544 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd        │ kubernetes-upgrade-846544   │ jenkins │ v1.37.0 │ 01 Dec 25 20:17 UTC │ 01 Dec 25 20:17 UTC │
	│ start   │ -p missing-upgrade-847129 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                        │ missing-upgrade-847129      │ jenkins │ v1.37.0 │ 01 Dec 25 20:17 UTC │ 01 Dec 25 20:18 UTC │
	│ stop    │ -p kubernetes-upgrade-846544                                                                                                                          │ kubernetes-upgrade-846544   │ jenkins │ v1.37.0 │ 01 Dec 25 20:17 UTC │ 01 Dec 25 20:17 UTC │
	│ start   │ -p kubernetes-upgrade-846544 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd │ kubernetes-upgrade-846544   │ jenkins │ v1.37.0 │ 01 Dec 25 20:17 UTC │                     │
	│ delete  │ -p missing-upgrade-847129                                                                                                                             │ missing-upgrade-847129      │ jenkins │ v1.37.0 │ 01 Dec 25 20:18 UTC │ 01 Dec 25 20:19 UTC │
	│ start   │ -p stopped-upgrade-632869 --memory=3072 --vm-driver=docker  --container-runtime=containerd                                                            │ stopped-upgrade-632869      │ jenkins │ v1.35.0 │ 01 Dec 25 20:19 UTC │ 01 Dec 25 20:19 UTC │
	│ stop    │ stopped-upgrade-632869 stop                                                                                                                           │ stopped-upgrade-632869      │ jenkins │ v1.35.0 │ 01 Dec 25 20:19 UTC │ 01 Dec 25 20:19 UTC │
	│ start   │ -p stopped-upgrade-632869 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                        │ stopped-upgrade-632869      │ jenkins │ v1.37.0 │ 01 Dec 25 20:19 UTC │ 01 Dec 25 20:24 UTC │
	│ delete  │ -p stopped-upgrade-632869                                                                                                                             │ stopped-upgrade-632869      │ jenkins │ v1.37.0 │ 01 Dec 25 20:24 UTC │ 01 Dec 25 20:24 UTC │
	│ start   │ -p running-upgrade-960321 --memory=3072 --vm-driver=docker  --container-runtime=containerd                                                            │ running-upgrade-960321      │ jenkins │ v1.35.0 │ 01 Dec 25 20:24 UTC │ 01 Dec 25 20:24 UTC │
	│ start   │ -p running-upgrade-960321 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                        │ running-upgrade-960321      │ jenkins │ v1.37.0 │ 01 Dec 25 20:24 UTC │ 01 Dec 25 20:29 UTC │
	│ delete  │ -p running-upgrade-960321                                                                                                                             │ running-upgrade-960321      │ jenkins │ v1.37.0 │ 01 Dec 25 20:29 UTC │ 01 Dec 25 20:29 UTC │
	│ start   │ -p pause-916050 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd                                       │ pause-916050                │ jenkins │ v1.37.0 │ 01 Dec 25 20:29 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴─────────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 20:29:32
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 20:29:32.450680  240007 out.go:360] Setting OutFile to fd 1 ...
	I1201 20:29:32.450798  240007 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:29:32.450802  240007 out.go:374] Setting ErrFile to fd 2...
	I1201 20:29:32.450805  240007 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:29:32.451127  240007 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 20:29:32.451614  240007 out.go:368] Setting JSON to false
	I1201 20:29:32.452673  240007 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":7924,"bootTime":1764613049,"procs":181,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 20:29:32.452747  240007 start.go:143] virtualization:  
	I1201 20:29:32.457579  240007 out.go:179] * [pause-916050] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 20:29:32.460259  240007 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 20:29:32.460358  240007 notify.go:221] Checking for updates...
	I1201 20:29:32.466169  240007 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 20:29:32.469505  240007 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 20:29:32.472333  240007 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 20:29:32.475221  240007 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 20:29:32.478066  240007 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 20:29:32.481356  240007 config.go:182] Loaded profile config "kubernetes-upgrade-846544": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 20:29:32.481456  240007 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 20:29:32.508532  240007 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 20:29:32.508651  240007 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 20:29:32.616195  240007 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:37 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 20:29:32.605899631 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 20:29:32.616289  240007 docker.go:319] overlay module found
	I1201 20:29:32.619346  240007 out.go:179] * Using the docker driver based on user configuration
	I1201 20:29:32.622262  240007 start.go:309] selected driver: docker
	I1201 20:29:32.622271  240007 start.go:927] validating driver "docker" against <nil>
	I1201 20:29:32.622282  240007 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 20:29:32.623033  240007 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 20:29:32.689012  240007 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:37 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 20:29:32.678747623 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 20:29:32.689148  240007 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1201 20:29:32.689391  240007 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1201 20:29:32.692228  240007 out.go:179] * Using Docker driver with root privileges
	I1201 20:29:32.695068  240007 cni.go:84] Creating CNI manager for ""
	I1201 20:29:32.695122  240007 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 20:29:32.695130  240007 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1201 20:29:32.695207  240007 start.go:353] cluster config:
	{Name:pause-916050 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-916050 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 20:29:32.698445  240007 out.go:179] * Starting "pause-916050" primary control-plane node in "pause-916050" cluster
	I1201 20:29:32.701276  240007 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 20:29:32.704219  240007 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1201 20:29:32.707071  240007 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1201 20:29:32.707111  240007 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1201 20:29:32.707121  240007 cache.go:65] Caching tarball of preloaded images
	I1201 20:29:32.707130  240007 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 20:29:32.707209  240007 preload.go:238] Found /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1201 20:29:32.707217  240007 cache.go:68] Finished verifying existence of preloaded tar for v1.34.2 on containerd
	I1201 20:29:32.707342  240007 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/config.json ...
	I1201 20:29:32.707359  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/config.json: {Name:mke8a7160a75499e3ab786e49637d03f9e36c29e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:32.727352  240007 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1201 20:29:32.727362  240007 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	I1201 20:29:32.727375  240007 cache.go:243] Successfully downloaded all kic artifacts
	I1201 20:29:32.727408  240007 start.go:360] acquireMachinesLock for pause-916050: {Name:mkafa13b1ee49be856af8a510a0d38170fb2d678 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1201 20:29:32.727562  240007 start.go:364] duration metric: took 104.682µs to acquireMachinesLock for "pause-916050"
	I1201 20:29:32.727587  240007 start.go:93] Provisioning new machine with config: &{Name:pause-916050 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-916050 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 20:29:32.727658  240007 start.go:125] createHost starting for "" (driver="docker")
	I1201 20:29:32.730941  240007 out.go:252] * Creating docker container (CPUs=2, Memory=3072MB) ...
	I1201 20:29:32.731172  240007 start.go:159] libmachine.API.Create for "pause-916050" (driver="docker")
	I1201 20:29:32.731209  240007 client.go:173] LocalClient.Create starting
	I1201 20:29:32.731301  240007 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem
	I1201 20:29:32.731337  240007 main.go:143] libmachine: Decoding PEM data...
	I1201 20:29:32.731351  240007 main.go:143] libmachine: Parsing certificate...
	I1201 20:29:32.731437  240007 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem
	I1201 20:29:32.731455  240007 main.go:143] libmachine: Decoding PEM data...
	I1201 20:29:32.731471  240007 main.go:143] libmachine: Parsing certificate...
	I1201 20:29:32.731831  240007 cli_runner.go:164] Run: docker network inspect pause-916050 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1201 20:29:32.747735  240007 cli_runner.go:211] docker network inspect pause-916050 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1201 20:29:32.747808  240007 network_create.go:284] running [docker network inspect pause-916050] to gather additional debugging logs...
	I1201 20:29:32.747823  240007 cli_runner.go:164] Run: docker network inspect pause-916050
	W1201 20:29:32.762329  240007 cli_runner.go:211] docker network inspect pause-916050 returned with exit code 1
	I1201 20:29:32.762347  240007 network_create.go:287] error running [docker network inspect pause-916050]: docker network inspect pause-916050: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network pause-916050 not found
	I1201 20:29:32.762358  240007 network_create.go:289] output of [docker network inspect pause-916050]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network pause-916050 not found
	
	** /stderr **
	I1201 20:29:32.762450  240007 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 20:29:32.779170  240007 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-4828e2f47bd3 IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:ba:78:79:c6:63:d1} reservation:<nil>}
	I1201 20:29:32.779514  240007 network.go:211] skipping subnet 192.168.58.0/24 that is taken: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName:br-80ec9dbbcada IfaceIPv4:192.168.58.1 IfaceMTU:1500 IfaceMAC:06:5b:ae:3f:75:a6} reservation:<nil>}
	I1201 20:29:32.779771  240007 network.go:211] skipping subnet 192.168.67.0/24 that is taken: &{IP:192.168.67.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.67.0/24 Gateway:192.168.67.1 ClientMin:192.168.67.2 ClientMax:192.168.67.254 Broadcast:192.168.67.255 IsPrivate:true Interface:{IfaceName:br-a3e6d0276c12 IfaceIPv4:192.168.67.1 IfaceMTU:1500 IfaceMAC:6e:ca:21:95:59:dc} reservation:<nil>}
	I1201 20:29:32.780063  240007 network.go:211] skipping subnet 192.168.76.0/24 that is taken: &{IP:192.168.76.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.76.0/24 Gateway:192.168.76.1 ClientMin:192.168.76.2 ClientMax:192.168.76.254 Broadcast:192.168.76.255 IsPrivate:true Interface:{IfaceName:br-f87e76bd5399 IfaceIPv4:192.168.76.1 IfaceMTU:1500 IfaceMAC:52:3b:14:d0:f7:3d} reservation:<nil>}
	I1201 20:29:32.780428  240007 network.go:206] using free private subnet 192.168.85.0/24: &{IP:192.168.85.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.85.0/24 Gateway:192.168.85.1 ClientMin:192.168.85.2 ClientMax:192.168.85.254 Broadcast:192.168.85.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001a04770}
	I1201 20:29:32.780442  240007 network_create.go:124] attempt to create docker network pause-916050 192.168.85.0/24 with gateway 192.168.85.1 and MTU of 1500 ...
	I1201 20:29:32.780504  240007 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.85.0/24 --gateway=192.168.85.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=pause-916050 pause-916050
	I1201 20:29:32.851725  240007 network_create.go:108] docker network pause-916050 192.168.85.0/24 created
	I1201 20:29:32.851753  240007 kic.go:121] calculated static IP "192.168.85.2" for the "pause-916050" container
	I1201 20:29:32.851835  240007 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1201 20:29:32.875627  240007 cli_runner.go:164] Run: docker volume create pause-916050 --label name.minikube.sigs.k8s.io=pause-916050 --label created_by.minikube.sigs.k8s.io=true
	I1201 20:29:32.895626  240007 oci.go:103] Successfully created a docker volume pause-916050
	I1201 20:29:32.895710  240007 cli_runner.go:164] Run: docker run --rm --name pause-916050-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=pause-916050 --entrypoint /usr/bin/test -v pause-916050:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -d /var/lib
	I1201 20:29:33.444273  240007 oci.go:107] Successfully prepared a docker volume pause-916050
	I1201 20:29:33.444328  240007 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1201 20:29:33.444336  240007 kic.go:194] Starting extracting preloaded images to volume ...
	I1201 20:29:33.444399  240007 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v pause-916050:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -I lz4 -xf /preloaded.tar -C /extractDir
	I1201 20:29:37.477719  240007 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v pause-916050:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -I lz4 -xf /preloaded.tar -C /extractDir: (4.033279553s)
	I1201 20:29:37.477740  240007 kic.go:203] duration metric: took 4.033401589s to extract preloaded images to volume ...
	W1201 20:29:37.477897  240007 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1201 20:29:37.478009  240007 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1201 20:29:37.532991  240007 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname pause-916050 --name pause-916050 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=pause-916050 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=pause-916050 --network pause-916050 --ip 192.168.85.2 --volume pause-916050:/var --security-opt apparmor=unconfined --memory=3072mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b
	I1201 20:29:37.852389  240007 cli_runner.go:164] Run: docker container inspect pause-916050 --format={{.State.Running}}
	I1201 20:29:37.872727  240007 cli_runner.go:164] Run: docker container inspect pause-916050 --format={{.State.Status}}
	I1201 20:29:37.894781  240007 cli_runner.go:164] Run: docker exec pause-916050 stat /var/lib/dpkg/alternatives/iptables
	I1201 20:29:37.953104  240007 oci.go:144] the created container "pause-916050" has a running status.
	I1201 20:29:37.953123  240007 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/pause-916050/id_rsa...
	I1201 20:29:38.291396  240007 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/21997-2497/.minikube/machines/pause-916050/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1201 20:29:38.313303  240007 cli_runner.go:164] Run: docker container inspect pause-916050 --format={{.State.Status}}
	I1201 20:29:38.351325  240007 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1201 20:29:38.351336  240007 kic_runner.go:114] Args: [docker exec --privileged pause-916050 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1201 20:29:38.402962  240007 cli_runner.go:164] Run: docker container inspect pause-916050 --format={{.State.Status}}
	I1201 20:29:38.437818  240007 machine.go:94] provisionDockerMachine start ...
	I1201 20:29:38.437911  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:38.466097  240007 main.go:143] libmachine: Using SSH client type: native
	I1201 20:29:38.466422  240007 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33033 <nil> <nil>}
	I1201 20:29:38.466429  240007 main.go:143] libmachine: About to run SSH command:
	hostname
	I1201 20:29:38.467059  240007 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:34232->127.0.0.1:33033: read: connection reset by peer
	I1201 20:29:41.621270  240007 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-916050
	
	I1201 20:29:41.621284  240007 ubuntu.go:182] provisioning hostname "pause-916050"
	I1201 20:29:41.621355  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:41.640354  240007 main.go:143] libmachine: Using SSH client type: native
	I1201 20:29:41.640661  240007 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33033 <nil> <nil>}
	I1201 20:29:41.640670  240007 main.go:143] libmachine: About to run SSH command:
	sudo hostname pause-916050 && echo "pause-916050" | sudo tee /etc/hostname
	I1201 20:29:41.798708  240007 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-916050
	
	I1201 20:29:41.798794  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:41.816210  240007 main.go:143] libmachine: Using SSH client type: native
	I1201 20:29:41.816525  240007 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33033 <nil> <nil>}
	I1201 20:29:41.816537  240007 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-916050' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-916050/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-916050' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1201 20:29:41.974008  240007 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1201 20:29:41.974022  240007 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/21997-2497/.minikube CaCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/21997-2497/.minikube}
	I1201 20:29:41.974049  240007 ubuntu.go:190] setting up certificates
	I1201 20:29:41.974057  240007 provision.go:84] configureAuth start
	I1201 20:29:41.974124  240007 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-916050
	I1201 20:29:41.991921  240007 provision.go:143] copyHostCerts
	I1201 20:29:41.991987  240007 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem, removing ...
	I1201 20:29:41.991994  240007 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem
	I1201 20:29:41.992073  240007 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/ca.pem (1078 bytes)
	I1201 20:29:41.992166  240007 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem, removing ...
	I1201 20:29:41.992170  240007 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem
	I1201 20:29:41.992194  240007 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/cert.pem (1123 bytes)
	I1201 20:29:41.992242  240007 exec_runner.go:144] found /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem, removing ...
	I1201 20:29:41.992245  240007 exec_runner.go:203] rm: /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem
	I1201 20:29:41.992269  240007 exec_runner.go:151] cp: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/21997-2497/.minikube/key.pem (1679 bytes)
	I1201 20:29:41.992329  240007 provision.go:117] generating server cert: /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem org=jenkins.pause-916050 san=[127.0.0.1 192.168.85.2 localhost minikube pause-916050]
	I1201 20:29:42.091788  240007 provision.go:177] copyRemoteCerts
	I1201 20:29:42.091864  240007 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1201 20:29:42.091914  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:42.111649  240007 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33033 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/pause-916050/id_rsa Username:docker}
	I1201 20:29:42.222757  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1201 20:29:42.243359  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I1201 20:29:42.263438  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1201 20:29:42.283247  240007 provision.go:87] duration metric: took 309.169719ms to configureAuth
	I1201 20:29:42.283265  240007 ubuntu.go:206] setting minikube options for container-runtime
	I1201 20:29:42.283471  240007 config.go:182] Loaded profile config "pause-916050": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 20:29:42.283477  240007 machine.go:97] duration metric: took 3.845649533s to provisionDockerMachine
	I1201 20:29:42.283482  240007 client.go:176] duration metric: took 9.55226873s to LocalClient.Create
	I1201 20:29:42.283504  240007 start.go:167] duration metric: took 9.552332624s to libmachine.API.Create "pause-916050"
	I1201 20:29:42.283515  240007 start.go:293] postStartSetup for "pause-916050" (driver="docker")
	I1201 20:29:42.283523  240007 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1201 20:29:42.283576  240007 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1201 20:29:42.283615  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:42.302055  240007 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33033 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/pause-916050/id_rsa Username:docker}
	I1201 20:29:42.406046  240007 ssh_runner.go:195] Run: cat /etc/os-release
	I1201 20:29:42.409367  240007 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1201 20:29:42.409385  240007 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1201 20:29:42.409395  240007 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/addons for local assets ...
	I1201 20:29:42.409448  240007 filesync.go:126] Scanning /home/jenkins/minikube-integration/21997-2497/.minikube/files for local assets ...
	I1201 20:29:42.409553  240007 filesync.go:149] local asset: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem -> 43052.pem in /etc/ssl/certs
	I1201 20:29:42.409651  240007 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1201 20:29:42.417020  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /etc/ssl/certs/43052.pem (1708 bytes)
	I1201 20:29:42.434857  240007 start.go:296] duration metric: took 151.329391ms for postStartSetup
	I1201 20:29:42.435250  240007 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-916050
	I1201 20:29:42.452619  240007 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/config.json ...
	I1201 20:29:42.455767  240007 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 20:29:42.455820  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:42.477146  240007 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33033 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/pause-916050/id_rsa Username:docker}
	I1201 20:29:42.578595  240007 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1201 20:29:42.583353  240007 start.go:128] duration metric: took 9.855683026s to createHost
	I1201 20:29:42.583367  240007 start.go:83] releasing machines lock for "pause-916050", held for 9.855797686s
	I1201 20:29:42.583436  240007 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-916050
	I1201 20:29:42.607199  240007 ssh_runner.go:195] Run: cat /version.json
	I1201 20:29:42.607239  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:42.607767  240007 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1201 20:29:42.607813  240007 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-916050
	I1201 20:29:42.629065  240007 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33033 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/pause-916050/id_rsa Username:docker}
	I1201 20:29:42.645548  240007 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33033 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/pause-916050/id_rsa Username:docker}
	I1201 20:29:42.745165  240007 ssh_runner.go:195] Run: systemctl --version
	I1201 20:29:42.837751  240007 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1201 20:29:42.842242  240007 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1201 20:29:42.842302  240007 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1201 20:29:42.869058  240007 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1201 20:29:42.869070  240007 start.go:496] detecting cgroup driver to use...
	I1201 20:29:42.869101  240007 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1201 20:29:42.869152  240007 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1201 20:29:42.884762  240007 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1201 20:29:42.898075  240007 docker.go:218] disabling cri-docker service (if available) ...
	I1201 20:29:42.898125  240007 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1201 20:29:42.915668  240007 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1201 20:29:42.934382  240007 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1201 20:29:43.046963  240007 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1201 20:29:43.172383  240007 docker.go:234] disabling docker service ...
	I1201 20:29:43.172445  240007 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1201 20:29:43.195892  240007 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1201 20:29:43.208874  240007 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1201 20:29:43.321542  240007 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1201 20:29:43.454579  240007 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1201 20:29:43.467887  240007 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1201 20:29:43.483730  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1201 20:29:43.492861  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1201 20:29:43.502132  240007 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1201 20:29:43.502220  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1201 20:29:43.511559  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 20:29:43.520286  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1201 20:29:43.530234  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1201 20:29:43.539342  240007 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1201 20:29:43.547513  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1201 20:29:43.556545  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1201 20:29:43.565170  240007 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1201 20:29:43.574020  240007 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1201 20:29:43.581664  240007 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1201 20:29:43.588891  240007 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 20:29:43.697868  240007 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1201 20:29:43.823900  240007 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1201 20:29:43.823961  240007 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1201 20:29:43.828106  240007 start.go:564] Will wait 60s for crictl version
	I1201 20:29:43.828157  240007 ssh_runner.go:195] Run: which crictl
	I1201 20:29:43.831731  240007 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1201 20:29:43.858522  240007 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1201 20:29:43.858596  240007 ssh_runner.go:195] Run: containerd --version
	I1201 20:29:43.880179  240007 ssh_runner.go:195] Run: containerd --version
	I1201 20:29:43.903471  240007 out.go:179] * Preparing Kubernetes v1.34.2 on containerd 2.1.5 ...
	I1201 20:29:43.906624  240007 cli_runner.go:164] Run: docker network inspect pause-916050 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1201 20:29:43.923814  240007 ssh_runner.go:195] Run: grep 192.168.85.1	host.minikube.internal$ /etc/hosts
	I1201 20:29:43.928806  240007 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.85.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1201 20:29:43.939930  240007 kubeadm.go:884] updating cluster {Name:pause-916050 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-916050 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1201 20:29:43.940037  240007 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1201 20:29:43.940102  240007 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 20:29:43.965533  240007 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 20:29:43.965545  240007 containerd.go:534] Images already preloaded, skipping extraction
	I1201 20:29:43.965608  240007 ssh_runner.go:195] Run: sudo crictl images --output json
	I1201 20:29:43.990340  240007 containerd.go:627] all images are preloaded for containerd runtime.
	I1201 20:29:43.990352  240007 cache_images.go:86] Images are preloaded, skipping loading
	I1201 20:29:43.990363  240007 kubeadm.go:935] updating node { 192.168.85.2 8443 v1.34.2 containerd true true} ...
	I1201 20:29:43.990457  240007 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=pause-916050 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.85.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.2 ClusterName:pause-916050 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1201 20:29:43.990517  240007 ssh_runner.go:195] Run: sudo crictl info
	I1201 20:29:44.020855  240007 cni.go:84] Creating CNI manager for ""
	I1201 20:29:44.020867  240007 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 20:29:44.020879  240007 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1201 20:29:44.020902  240007 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.85.2 APIServerPort:8443 KubernetesVersion:v1.34.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-916050 NodeName:pause-916050 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.85.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.85.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1201 20:29:44.021022  240007 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.85.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "pause-916050"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.85.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.85.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1201 20:29:44.021091  240007 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.2
	I1201 20:29:44.029761  240007 binaries.go:51] Found k8s binaries, skipping transfer
	I1201 20:29:44.029824  240007 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1201 20:29:44.038912  240007 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (316 bytes)
	I1201 20:29:44.052987  240007 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1201 20:29:44.067358  240007 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2225 bytes)
	I1201 20:29:44.081818  240007 ssh_runner.go:195] Run: grep 192.168.85.2	control-plane.minikube.internal$ /etc/hosts
	I1201 20:29:44.087002  240007 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.85.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1201 20:29:44.098068  240007 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 20:29:44.226339  240007 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 20:29:44.242710  240007 certs.go:69] Setting up /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050 for IP: 192.168.85.2
	I1201 20:29:44.242721  240007 certs.go:195] generating shared ca certs ...
	I1201 20:29:44.242736  240007 certs.go:227] acquiring lock for ca certs: {Name:mk44a77eee505d9292fa413ae7abec1c290fec42 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:44.242954  240007 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key
	I1201 20:29:44.243013  240007 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key
	I1201 20:29:44.243019  240007 certs.go:257] generating profile certs ...
	I1201 20:29:44.243084  240007 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/client.key
	I1201 20:29:44.243098  240007 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/client.crt with IP's: []
	I1201 20:29:44.386268  240007 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/client.crt ...
	I1201 20:29:44.386286  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/client.crt: {Name:mk7bf0d19c8d1de34eae506c81249e7c9eff3e02 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:44.386483  240007 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/client.key ...
	I1201 20:29:44.386489  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/client.key: {Name:mk82d552a8c919ea260d8bc7bbf8c52fa9874db8 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:44.386577  240007 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.key.ca1c7ef0
	I1201 20:29:44.386590  240007 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.crt.ca1c7ef0 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.85.2]
	I1201 20:29:44.652061  240007 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.crt.ca1c7ef0 ...
	I1201 20:29:44.652077  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.crt.ca1c7ef0: {Name:mka5549d54e64cd313f74e0a6c5d8ae98738ff85 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:44.652264  240007 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.key.ca1c7ef0 ...
	I1201 20:29:44.652272  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.key.ca1c7ef0: {Name:mk9e9b4f5d482faef1623817a801c98c75b56c99 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:44.652356  240007 certs.go:382] copying /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.crt.ca1c7ef0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.crt
	I1201 20:29:44.652425  240007 certs.go:386] copying /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.key.ca1c7ef0 -> /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.key
	I1201 20:29:44.652490  240007 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.key
	I1201 20:29:44.652501  240007 crypto.go:68] Generating cert /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.crt with IP's: []
	I1201 20:29:44.942294  240007 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.crt ...
	I1201 20:29:44.942309  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.crt: {Name:mk15c17c76207963e1f1da815a0ed73fc5b2223a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:44.942491  240007 crypto.go:164] Writing key to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.key ...
	I1201 20:29:44.942497  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.key: {Name:mk42cfc5261788564dfbddb43d84f5f0899e6cea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:29:44.942682  240007 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem (1338 bytes)
	W1201 20:29:44.942724  240007 certs.go:480] ignoring /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305_empty.pem, impossibly tiny 0 bytes
	I1201 20:29:44.942730  240007 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca-key.pem (1675 bytes)
	I1201 20:29:44.942929  240007 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/ca.pem (1078 bytes)
	I1201 20:29:44.942961  240007 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/cert.pem (1123 bytes)
	I1201 20:29:44.942984  240007 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/certs/key.pem (1679 bytes)
	I1201 20:29:44.943058  240007 certs.go:484] found cert: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem (1708 bytes)
	I1201 20:29:44.944279  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1201 20:29:44.962852  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I1201 20:29:44.982481  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1201 20:29:45.001815  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1201 20:29:45.082779  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I1201 20:29:45.148755  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1201 20:29:45.173616  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1201 20:29:45.201209  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/pause-916050/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1201 20:29:45.242875  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/certs/4305.pem --> /usr/share/ca-certificates/4305.pem (1338 bytes)
	I1201 20:29:45.276002  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/ssl/certs/43052.pem --> /usr/share/ca-certificates/43052.pem (1708 bytes)
	I1201 20:29:45.301585  240007 ssh_runner.go:362] scp /home/jenkins/minikube-integration/21997-2497/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1201 20:29:45.331075  240007 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1201 20:29:45.354627  240007 ssh_runner.go:195] Run: openssl version
	I1201 20:29:45.361543  240007 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4305.pem && ln -fs /usr/share/ca-certificates/4305.pem /etc/ssl/certs/4305.pem"
	I1201 20:29:45.370784  240007 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4305.pem
	I1201 20:29:45.375096  240007 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  1 19:17 /usr/share/ca-certificates/4305.pem
	I1201 20:29:45.375153  240007 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4305.pem
	I1201 20:29:45.422650  240007 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4305.pem /etc/ssl/certs/51391683.0"
	I1201 20:29:45.432579  240007 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/43052.pem && ln -fs /usr/share/ca-certificates/43052.pem /etc/ssl/certs/43052.pem"
	I1201 20:29:45.440900  240007 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/43052.pem
	I1201 20:29:45.444654  240007 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  1 19:17 /usr/share/ca-certificates/43052.pem
	I1201 20:29:45.444718  240007 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/43052.pem
	I1201 20:29:45.488731  240007 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/43052.pem /etc/ssl/certs/3ec20f2e.0"
	I1201 20:29:45.497736  240007 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1201 20:29:45.506157  240007 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1201 20:29:45.510241  240007 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  1 19:07 /usr/share/ca-certificates/minikubeCA.pem
	I1201 20:29:45.510299  240007 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1201 20:29:45.551674  240007 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1201 20:29:45.560145  240007 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1201 20:29:45.563941  240007 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1201 20:29:45.563986  240007 kubeadm.go:401] StartCluster: {Name:pause-916050 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-916050 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[
] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath:
SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 20:29:45.564063  240007 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1201 20:29:45.564124  240007 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1201 20:29:45.596206  240007 cri.go:89] found id: ""
	I1201 20:29:45.596267  240007 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1201 20:29:45.607863  240007 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1201 20:29:45.619945  240007 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1201 20:29:45.619999  240007 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1201 20:29:45.630709  240007 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1201 20:29:45.630717  240007 kubeadm.go:158] found existing configuration files:
	
	I1201 20:29:45.630766  240007 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1201 20:29:45.641091  240007 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1201 20:29:45.641147  240007 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1201 20:29:45.651742  240007 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1201 20:29:45.659384  240007 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1201 20:29:45.659440  240007 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1201 20:29:45.666853  240007 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1201 20:29:45.674603  240007 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1201 20:29:45.674659  240007 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1201 20:29:45.682063  240007 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1201 20:29:45.689591  240007 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1201 20:29:45.689652  240007 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1201 20:29:45.697380  240007 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1201 20:29:45.761919  240007 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1201 20:29:45.762141  240007 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 20:29:45.838522  240007 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 20:30:01.502155  240007 kubeadm.go:319] [init] Using Kubernetes version: v1.34.2
	I1201 20:30:01.502205  240007 kubeadm.go:319] [preflight] Running pre-flight checks
	I1201 20:30:01.502292  240007 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1201 20:30:01.502348  240007 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1201 20:30:01.502380  240007 kubeadm.go:319] OS: Linux
	I1201 20:30:01.502431  240007 kubeadm.go:319] CGROUPS_CPU: enabled
	I1201 20:30:01.502478  240007 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1201 20:30:01.502525  240007 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1201 20:30:01.502582  240007 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1201 20:30:01.502629  240007 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1201 20:30:01.502680  240007 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1201 20:30:01.502724  240007 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1201 20:30:01.502777  240007 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1201 20:30:01.502848  240007 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1201 20:30:01.502926  240007 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1201 20:30:01.503022  240007 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1201 20:30:01.503112  240007 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1201 20:30:01.503173  240007 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1201 20:30:01.506536  240007 out.go:252]   - Generating certificates and keys ...
	I1201 20:30:01.506678  240007 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1201 20:30:01.506751  240007 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1201 20:30:01.506830  240007 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1201 20:30:01.506911  240007 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1201 20:30:01.506984  240007 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1201 20:30:01.507050  240007 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1201 20:30:01.507118  240007 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1201 20:30:01.507242  240007 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [localhost pause-916050] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1201 20:30:01.507297  240007 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1201 20:30:01.507419  240007 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [localhost pause-916050] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1201 20:30:01.507488  240007 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1201 20:30:01.507562  240007 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1201 20:30:01.507609  240007 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1201 20:30:01.507670  240007 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1201 20:30:01.507724  240007 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1201 20:30:01.507783  240007 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1201 20:30:01.507840  240007 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1201 20:30:01.507907  240007 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1201 20:30:01.507964  240007 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1201 20:30:01.508050  240007 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1201 20:30:01.508120  240007 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1201 20:30:01.511349  240007 out.go:252]   - Booting up control plane ...
	I1201 20:30:01.511454  240007 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1201 20:30:01.511543  240007 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1201 20:30:01.511621  240007 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1201 20:30:01.511731  240007 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1201 20:30:01.511843  240007 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1201 20:30:01.511955  240007 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1201 20:30:01.512045  240007 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1201 20:30:01.512086  240007 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1201 20:30:01.512230  240007 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1201 20:30:01.512340  240007 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1201 20:30:01.512402  240007 kubeadm.go:319] [kubelet-check] The kubelet is healthy after 2.001304655s
	I1201 20:30:01.512499  240007 kubeadm.go:319] [control-plane-check] Waiting for healthy control plane components. This can take up to 4m0s
	I1201 20:30:01.512583  240007 kubeadm.go:319] [control-plane-check] Checking kube-apiserver at https://192.168.85.2:8443/livez
	I1201 20:30:01.512677  240007 kubeadm.go:319] [control-plane-check] Checking kube-controller-manager at https://127.0.0.1:10257/healthz
	I1201 20:30:01.512759  240007 kubeadm.go:319] [control-plane-check] Checking kube-scheduler at https://127.0.0.1:10259/livez
	I1201 20:30:01.512838  240007 kubeadm.go:319] [control-plane-check] kube-controller-manager is healthy after 2.49470696s
	I1201 20:30:01.512909  240007 kubeadm.go:319] [control-plane-check] kube-scheduler is healthy after 4.287577564s
	I1201 20:30:01.512995  240007 kubeadm.go:319] [control-plane-check] kube-apiserver is healthy after 6.003796963s
	I1201 20:30:01.513107  240007 kubeadm.go:319] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I1201 20:30:01.513243  240007 kubeadm.go:319] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I1201 20:30:01.513303  240007 kubeadm.go:319] [upload-certs] Skipping phase. Please see --upload-certs
	I1201 20:30:01.513690  240007 kubeadm.go:319] [mark-control-plane] Marking the node pause-916050 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I1201 20:30:01.513760  240007 kubeadm.go:319] [bootstrap-token] Using token: bzl2bm.gdr1lcemckp77oa1
	I1201 20:30:01.516666  240007 out.go:252]   - Configuring RBAC rules ...
	I1201 20:30:01.516816  240007 kubeadm.go:319] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I1201 20:30:01.516911  240007 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I1201 20:30:01.517071  240007 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I1201 20:30:01.517217  240007 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I1201 20:30:01.517354  240007 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I1201 20:30:01.517589  240007 kubeadm.go:319] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I1201 20:30:01.517725  240007 kubeadm.go:319] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I1201 20:30:01.517778  240007 kubeadm.go:319] [addons] Applied essential addon: CoreDNS
	I1201 20:30:01.517823  240007 kubeadm.go:319] [addons] Applied essential addon: kube-proxy
	I1201 20:30:01.517826  240007 kubeadm.go:319] 
	I1201 20:30:01.517907  240007 kubeadm.go:319] Your Kubernetes control-plane has initialized successfully!
	I1201 20:30:01.517912  240007 kubeadm.go:319] 
	I1201 20:30:01.517995  240007 kubeadm.go:319] To start using your cluster, you need to run the following as a regular user:
	I1201 20:30:01.517998  240007 kubeadm.go:319] 
	I1201 20:30:01.518022  240007 kubeadm.go:319]   mkdir -p $HOME/.kube
	I1201 20:30:01.518087  240007 kubeadm.go:319]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I1201 20:30:01.518146  240007 kubeadm.go:319]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I1201 20:30:01.518149  240007 kubeadm.go:319] 
	I1201 20:30:01.518202  240007 kubeadm.go:319] Alternatively, if you are the root user, you can run:
	I1201 20:30:01.518205  240007 kubeadm.go:319] 
	I1201 20:30:01.518258  240007 kubeadm.go:319]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I1201 20:30:01.518261  240007 kubeadm.go:319] 
	I1201 20:30:01.518317  240007 kubeadm.go:319] You should now deploy a pod network to the cluster.
	I1201 20:30:01.518399  240007 kubeadm.go:319] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I1201 20:30:01.518468  240007 kubeadm.go:319]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I1201 20:30:01.518482  240007 kubeadm.go:319] 
	I1201 20:30:01.518617  240007 kubeadm.go:319] You can now join any number of control-plane nodes by copying certificate authorities
	I1201 20:30:01.518694  240007 kubeadm.go:319] and service account keys on each node and then running the following as root:
	I1201 20:30:01.518697  240007 kubeadm.go:319] 
	I1201 20:30:01.518781  240007 kubeadm.go:319]   kubeadm join control-plane.minikube.internal:8443 --token bzl2bm.gdr1lcemckp77oa1 \
	I1201 20:30:01.518886  240007 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:ab4a5777ea46932c6cc337090198b4c8417ab7ccf1b0a13e3c164c6515145a32 \
	I1201 20:30:01.518905  240007 kubeadm.go:319] 	--control-plane 
	I1201 20:30:01.518909  240007 kubeadm.go:319] 
	I1201 20:30:01.519001  240007 kubeadm.go:319] Then you can join any number of worker nodes by running the following on each as root:
	I1201 20:30:01.519005  240007 kubeadm.go:319] 
	I1201 20:30:01.519086  240007 kubeadm.go:319] kubeadm join control-plane.minikube.internal:8443 --token bzl2bm.gdr1lcemckp77oa1 \
	I1201 20:30:01.519188  240007 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:ab4a5777ea46932c6cc337090198b4c8417ab7ccf1b0a13e3c164c6515145a32 
	I1201 20:30:01.519213  240007 cni.go:84] Creating CNI manager for ""
	I1201 20:30:01.519220  240007 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 20:30:01.522500  240007 out.go:179] * Configuring CNI (Container Networking Interface) ...
	I1201 20:30:01.525597  240007 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I1201 20:30:01.536492  240007 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.2/kubectl ...
	I1201 20:30:01.536503  240007 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I1201 20:30:01.562965  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I1201 20:30:01.961954  240007 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1201 20:30:01.962098  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:01.962188  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes pause-916050 minikube.k8s.io/updated_at=2025_12_01T20_30_01_0700 minikube.k8s.io/version=v1.37.0 minikube.k8s.io/commit=3ab9e66fb642a86710fef1e3147732f1580938c9 minikube.k8s.io/name=pause-916050 minikube.k8s.io/primary=true
	I1201 20:30:01.974198  240007 ops.go:34] apiserver oom_adj: -16
	I1201 20:30:02.084140  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:02.584608  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:03.085031  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:03.584226  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:04.084234  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:04.585075  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:05.085105  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:05.584361  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:06.084196  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:06.585190  240007 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1201 20:30:06.707791  240007 kubeadm.go:1114] duration metric: took 4.74575633s to wait for elevateKubeSystemPrivileges
	I1201 20:30:06.707810  240007 kubeadm.go:403] duration metric: took 21.143828771s to StartCluster
	I1201 20:30:06.707825  240007 settings.go:142] acquiring lock: {Name:mk0c68be267fd1e06eeb79721201896d000b433c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:30:06.707899  240007 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 20:30:06.708815  240007 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/21997-2497/kubeconfig: {Name:mkdb60af0cfe5fe9d8057697c65ddbe5a2224835 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1201 20:30:06.708992  240007 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1201 20:30:06.709071  240007 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1201 20:30:06.709299  240007 config.go:182] Loaded profile config "pause-916050": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 20:30:06.712515  240007 out.go:179] * Verifying Kubernetes components...
	I1201 20:30:06.715657  240007 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1201 20:30:06.999865  240007 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.85.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I1201 20:30:07.043149  240007 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1201 20:30:07.295289  240007 start.go:977] {"host.minikube.internal": 192.168.85.1} host record injected into CoreDNS's ConfigMap
	I1201 20:30:07.298731  240007 node_ready.go:35] waiting up to 6m0s for node "pause-916050" to be "Ready" ...
	I1201 20:30:07.805432  240007 kapi.go:214] "coredns" deployment in "kube-system" namespace and "pause-916050" context rescaled to 1 replicas
	W1201 20:30:09.301543  240007 node_ready.go:57] node "pause-916050" has "Ready":"False" status (will retry)
	W1201 20:30:11.801631  240007 node_ready.go:57] node "pause-916050" has "Ready":"False" status (will retry)
	I1201 20:30:13.324007  199924 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000714482s
	I1201 20:30:13.324103  199924 kubeadm.go:319] 
	I1201 20:30:13.324197  199924 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1201 20:30:13.324268  199924 kubeadm.go:319] 	- The kubelet is not running
	I1201 20:30:13.324416  199924 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1201 20:30:13.324455  199924 kubeadm.go:319] 
	I1201 20:30:13.324597  199924 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1201 20:30:13.324655  199924 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1201 20:30:13.324713  199924 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1201 20:30:13.324738  199924 kubeadm.go:319] 
	I1201 20:30:13.328109  199924 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1201 20:30:13.328573  199924 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1201 20:30:13.328702  199924 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1201 20:30:13.328942  199924 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1201 20:30:13.328948  199924 kubeadm.go:319] 
	I1201 20:30:13.329017  199924 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1201 20:30:13.329075  199924 kubeadm.go:403] duration metric: took 12m9.776495224s to StartCluster
	I1201 20:30:13.329110  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1201 20:30:13.329170  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1201 20:30:13.354820  199924 cri.go:89] found id: ""
	I1201 20:30:13.354843  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.354852  199924 logs.go:284] No container was found matching "kube-apiserver"
	I1201 20:30:13.354859  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1201 20:30:13.354920  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1201 20:30:13.387756  199924 cri.go:89] found id: ""
	I1201 20:30:13.387835  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.387857  199924 logs.go:284] No container was found matching "etcd"
	I1201 20:30:13.387875  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1201 20:30:13.387957  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1201 20:30:13.413213  199924 cri.go:89] found id: ""
	I1201 20:30:13.413235  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.413243  199924 logs.go:284] No container was found matching "coredns"
	I1201 20:30:13.413250  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1201 20:30:13.413310  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1201 20:30:13.452434  199924 cri.go:89] found id: ""
	I1201 20:30:13.452499  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.452531  199924 logs.go:284] No container was found matching "kube-scheduler"
	I1201 20:30:13.452553  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1201 20:30:13.452664  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1201 20:30:13.479043  199924 cri.go:89] found id: ""
	I1201 20:30:13.479068  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.479078  199924 logs.go:284] No container was found matching "kube-proxy"
	I1201 20:30:13.479085  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1201 20:30:13.479145  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1201 20:30:13.504793  199924 cri.go:89] found id: ""
	I1201 20:30:13.504828  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.504837  199924 logs.go:284] No container was found matching "kube-controller-manager"
	I1201 20:30:13.504861  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1201 20:30:13.504943  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1201 20:30:13.532075  199924 cri.go:89] found id: ""
	I1201 20:30:13.532107  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.532116  199924 logs.go:284] No container was found matching "kindnet"
	I1201 20:30:13.532139  199924 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1201 20:30:13.532219  199924 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1201 20:30:13.558878  199924 cri.go:89] found id: ""
	I1201 20:30:13.558945  199924 logs.go:282] 0 containers: []
	W1201 20:30:13.558967  199924 logs.go:284] No container was found matching "storage-provisioner"
	I1201 20:30:13.558991  199924 logs.go:123] Gathering logs for kubelet ...
	I1201 20:30:13.559009  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1201 20:30:13.618214  199924 logs.go:123] Gathering logs for dmesg ...
	I1201 20:30:13.618256  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1201 20:30:13.636942  199924 logs.go:123] Gathering logs for describe nodes ...
	I1201 20:30:13.637021  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1201 20:30:13.713742  199924 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1201 20:30:13.713810  199924 logs.go:123] Gathering logs for containerd ...
	I1201 20:30:13.713849  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1201 20:30:13.752642  199924 logs.go:123] Gathering logs for container status ...
	I1201 20:30:13.752676  199924 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1201 20:30:13.786390  199924 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000714482s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1201 20:30:13.786444  199924 out.go:285] * 
	W1201 20:30:13.786510  199924 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000714482s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 20:30:13.786528  199924 out.go:285] * 
	W1201 20:30:13.788983  199924 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1201 20:30:13.794629  199924 out.go:203] 
	W1201 20:30:13.797610  199924 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000714482s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1201 20:30:13.797667  199924 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1201 20:30:13.797690  199924 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1201 20:30:13.801069  199924 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.594197180Z" level=info msg="StopPodSandbox for \"6e679859e63bf37eac3d11e98c3797487b465af32d635a4e097fbfafc6b1e85d\" returns successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.594567219Z" level=info msg="RemovePodSandbox for \"6e679859e63bf37eac3d11e98c3797487b465af32d635a4e097fbfafc6b1e85d\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.594606563Z" level=info msg="Forcibly stopping sandbox \"6e679859e63bf37eac3d11e98c3797487b465af32d635a4e097fbfafc6b1e85d\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.594641624Z" level=info msg="Container to stop \"f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.594991642Z" level=info msg="TearDown network for sandbox \"6e679859e63bf37eac3d11e98c3797487b465af32d635a4e097fbfafc6b1e85d\" successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.603136468Z" level=info msg="Ensure that sandbox 6e679859e63bf37eac3d11e98c3797487b465af32d635a4e097fbfafc6b1e85d in task-service has been cleanup successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.609978692Z" level=info msg="RemovePodSandbox \"6e679859e63bf37eac3d11e98c3797487b465af32d635a4e097fbfafc6b1e85d\" returns successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.610502300Z" level=info msg="StopPodSandbox for \"70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.610578067Z" level=info msg="Container to stop \"95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.610976036Z" level=info msg="TearDown network for sandbox \"70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208\" successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.611029797Z" level=info msg="StopPodSandbox for \"70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208\" returns successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.611442765Z" level=info msg="RemovePodSandbox for \"70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.611482823Z" level=info msg="Forcibly stopping sandbox \"70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.611517154Z" level=info msg="Container to stop \"95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.611876321Z" level=info msg="TearDown network for sandbox \"70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208\" successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.620082563Z" level=info msg="Ensure that sandbox 70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208 in task-service has been cleanup successfully"
	Dec 01 20:22:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:22:08.626002003Z" level=info msg="RemovePodSandbox \"70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208\" returns successfully"
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.569553387Z" level=info msg="container event discarded" container=2ceedc38853c4c82468dd709fd483e296915c26ad5d5fd15488b196998ba2a52 type=CONTAINER_DELETED_EVENT
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.584880306Z" level=info msg="container event discarded" container=d61116b853932193b5ae756e6561ddd63eb71c79c6e5919a0ae3bf9ab3d1f8a2 type=CONTAINER_DELETED_EVENT
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.596182379Z" level=info msg="container event discarded" container=f6c08930d254bf6a7f78cd2402f59aa0b9832f679c26714f4e4a141ed877eb26 type=CONTAINER_DELETED_EVENT
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.596243754Z" level=info msg="container event discarded" container=07a23cfbbfd05cba48a3fba571b57c8a39e980a80ffeaa0758c2f51239c558b0 type=CONTAINER_DELETED_EVENT
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.613618272Z" level=info msg="container event discarded" container=f37bc0607ad7d0a39ea9634cbdb945a6266c9536f2262a418c05d32f9d509129 type=CONTAINER_DELETED_EVENT
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.613677358Z" level=info msg="container event discarded" container=6e679859e63bf37eac3d11e98c3797487b465af32d635a4e097fbfafc6b1e85d type=CONTAINER_DELETED_EVENT
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.629896145Z" level=info msg="container event discarded" container=95107b0ab6c96885841377eee729d17a739da2aaecdd8331c3efa96404be630a type=CONTAINER_DELETED_EVENT
	Dec 01 20:27:08 kubernetes-upgrade-846544 containerd[556]: time="2025-12-01T20:27:08.629951629Z" level=info msg="container event discarded" container=70c3edcdca7e9916a7615fecf83940ff7d95654121a6fffaf472c1a12c6d8208 type=CONTAINER_DELETED_EVENT
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 1 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015295] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.547776] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.034333] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.774491] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.932193] kauditd_printk_skb: 36 callbacks suppressed
	[Dec 1 19:51] hrtimer: interrupt took 26509791 ns
	
	
	==> kernel <==
	 20:30:15 up  2:12,  0 user,  load average: 1.95, 1.47, 1.74
	Linux kubernetes-upgrade-846544 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 01 20:30:12 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 20:30:12 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 01 20:30:12 kubernetes-upgrade-846544 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:12 kubernetes-upgrade-846544 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:12 kubernetes-upgrade-846544 kubelet[14449]: E1201 20:30:12.897790   14449 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 20:30:12 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 20:30:12 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 20:30:13 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 01 20:30:13 kubernetes-upgrade-846544 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:13 kubernetes-upgrade-846544 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:13 kubernetes-upgrade-846544 kubelet[14518]: E1201 20:30:13.666357   14518 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 20:30:13 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 20:30:13 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 20:30:14 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 01 20:30:14 kubernetes-upgrade-846544 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:14 kubernetes-upgrade-846544 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:14 kubernetes-upgrade-846544 kubelet[14547]: E1201 20:30:14.450743   14547 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 20:30:14 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 20:30:14 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 01 20:30:15 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 01 20:30:15 kubernetes-upgrade-846544 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:15 kubernetes-upgrade-846544 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 01 20:30:15 kubernetes-upgrade-846544 kubelet[14567]: E1201 20:30:15.191291   14567 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 01 20:30:15 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 01 20:30:15 kubernetes-upgrade-846544 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-846544 -n kubernetes-upgrade-846544
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-846544 -n kubernetes-upgrade-846544: exit status 2 (372.563317ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "kubernetes-upgrade-846544" apiserver is not running, skipping kubectl commands (state="Stopped")
helpers_test.go:175: Cleaning up "kubernetes-upgrade-846544" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p kubernetes-upgrade-846544
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p kubernetes-upgrade-846544: (2.246479121s)
--- FAIL: TestKubernetesUpgrade (794.27s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (7200.175s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:285: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1201 21:03:06.115932    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kindnet-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[16 identical WARNING lines omitted]
E1201 21:03:23.311226    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/old-k8s-version-022251/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[3 identical WARNING lines omitted]
E1201 21:03:26.597614    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kindnet-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[12 identical WARNING lines omitted]
E1201 21:03:39.681023    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/auto-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[16 identical WARNING lines omitted]
I1201 21:03:56.453643    4305 config.go:182] Loaded profile config "flannel-001463": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[10 identical WARNING lines omitted]
E1201 21:04:07.559305    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/kindnet-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[11 identical WARNING lines omitted]
E1201 21:04:20.211374    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 21:04:20.217857    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 21:04:20.232109    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 21:04:20.254032    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 21:04:20.295334    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 21:04:20.377068    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1201 21:04:20.538555    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 21:04:20.860661    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1201 21:04:21.502417    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1201 21:04:22.784362    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[1 identical WARNING line omitted]
E1201 21:04:25.346268    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
[5 identical WARNING lines omitted]
E1201 21:04:30.468444    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1201 21:04:40.710414    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1201 21:04:52.599292    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1201 21:05:01.192083    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/calico-001463/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
panic: test timed out after 2h0m0s
	running tests:
		TestNetworkPlugins (34m19s)
		TestNetworkPlugins/group/bridge (49s)
		TestNetworkPlugins/group/bridge/NetCatPod (2s)
		TestStartStop (35m44s)
		TestStartStop/group/no-preload (27m41s)
		TestStartStop/group/no-preload/serial (27m41s)
		TestStartStop/group/no-preload/serial/AddonExistsAfterStop (2m12s)

goroutine 6387 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2682 +0x2b0
created by time.goFunc
	/usr/local/go/src/time/sleep.go:215 +0x38

goroutine 1 [chan receive, 30 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4000334fc0, 0x4000661bb8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
testing.runTests(0x400073e270, {0x534c580, 0x2c, 0x2c}, {0x4000661d08?, 0x125774?, 0x5374f80?})
	/usr/local/go/src/testing/testing.go:2475 +0x3b8
testing.(*M).Run(0x40004f7040)
	/usr/local/go/src/testing/testing.go:2337 +0x530
k8s.io/minikube/test/integration.TestMain(0x40004f7040)
	/home/jenkins/workspace/Build_Cross/test/integration/main_test.go:64 +0xf0
main.main()
	_testmain.go:133 +0x88

goroutine 6073 [chan receive, 2 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40019bba40, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 6071
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1534 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x40014e6300?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 1533
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 155 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x40017c0540?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 154
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 3261 [chan receive, 35 minutes]:
testing.(*T).Run(0x40019e0700, {0x296d53a?, 0x4001d87f58?}, 0x339b730)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop(0x40019e0700)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:46 +0x3c
testing.tRunner(0x40019e0700, 0x339b548)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 1561 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40019c7450, 0x24)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019c7440)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40013eccc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40003ef420?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4001538f38, {0x369d680, 0x4001c47470}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x4001c47470?}, 0x80?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40012fd110, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1535
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 1563 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 1562
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 161 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40019ac550, 0x2d)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019ac540)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40004ef020)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40017c85b0?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x40000d5f38, {0x369d680, 0x40017b0030}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0?, {0x369d680?, 0x40017b0030?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001796000, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 156
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3197 [chan receive, 35 minutes]:
testing.(*T).Run(0x400146a000, {0x296d53a?, 0x6409613372c?}, 0x40017a1518)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestNetworkPlugins(0x400146a000)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:52 +0xe4
testing.tRunner(0x400146a000, 0x339b500)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3513 [chan receive, 8 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4001a2b500, 0x40017a1518)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3197
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 6364 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 6363
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 163 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 162
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 162 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x40000d4f40, 0x40000d4f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x84?, 0x40000d4f40, 0x40000d4f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x161f90?, 0x40000a0f50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000294000?, 0x40017c0540?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 156
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 156 [chan receive, 116 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40004ef020, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 154
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1535 [chan receive, 80 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40013eccc0, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1533
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4266 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40007f46d0, 0x2)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40007f46c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40013ed080)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x53a3160?, 0x2a0ac?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0xffff937e7108?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4000663f38, {0x369d680, 0x40006e4000}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x40006e4000?}, 0x30?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001479480, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4263
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3897 [chan receive, 30 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40013c8cc0, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3920
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 872 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0x4001ca7e90, 0x2b)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001ca7e80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40013ed020)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40017c9180?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x400139eea8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x400153af38, {0x369d680, 0x4000649e30}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x400139efa8?, {0x369d680?, 0x4000649e30?}, 0xa0?, 0x4001a46600?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4000692890, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 889
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 889 [chan receive, 110 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40013ed020, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 887
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 6072 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x400184ae00?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 6071
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1316 [IO wait, 108 minutes]:
internal/poll.runtime_pollWait(0xffff4cd1d600, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x40016a7e80?, 0xdbd0c?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x40016a7e80)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x40016a7e80)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x4001833ec0)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x4001833ec0)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x4001695800, {0x36d3120, 0x4001833ec0})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x4001695800)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 1314
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 4877 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4876
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 679 [IO wait, 112 minutes]:
internal/poll.runtime_pollWait(0xffff4cd1e000, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001920580?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x4001920580)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x4001920580)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x4001ca6c40)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x4001ca6c40)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x400018c900, {0x36d3120, 0x4001ca6c40})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x400018c900)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 677
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 5972 [select]:
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x36e5708, 0x4001aea4b0}, {0x36d3780, 0x400191a8c0}, 0x1, 0x0, 0x400131bb00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/loop.go:66 +0x158
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x36e5778?, 0x400046d7a0?}, 0x3b9aca00, 0x400131bd28?, 0x1, 0x400131bb00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:48 +0x8c
k8s.io/minikube/test/integration.PodWait({0x36e5778, 0x400046d7a0}, 0x400184a700, {0x4001c593e0, 0x11}, {0x2993faf, 0x14}, {0x29abe76, 0x1c}, 0x7dba821800)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:379 +0x22c
k8s.io/minikube/test/integration.validateAddonAfterStop({0x36e5778, 0x400046d7a0}, 0x400184a700, {0x4001c593e0, 0x11}, {0x297850c?, 0x1781ce6600161e84?}, {0x692e0288?, 0x400136ef58?}, {0x161f08?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:285 +0xd4
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x400184a700?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x400184a700, 0x4001524000)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4072
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3465 [chan receive, 28 minutes]:
testing.(*T).Run(0x4001a2aa80, {0x296e9ac?, 0x0?}, 0x400036f000)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x4001a2aa80)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x4001a2aa80, 0x40017ba280)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3461
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4560 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40007f4110, 0x10)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40007f4100)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40004eee40)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001338688?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x40003713c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4001315f38, {0x369d680, 0x400059a480}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369d680?, 0x400059a480?}, 0x60?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40019aeeb0, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4557
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 1120 [chan send, 108 minutes]:
os/exec.(*Cmd).watchCtx(0x4001807b00, 0x40017e1500)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1119
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 873 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x4001336740, 0x4001329f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x1c?, 0x4001336740, 0x4001336788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x0?, 0x4001336750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000294000?, 0x4001462300?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 889
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 6363 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x4001338740, 0x4001338788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x0?, 0x4001338740, 0x4001338788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x36e5778?, 0x4001c50af0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x4001c50a10?, 0x0?, 0x40013b5e00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6359
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5810 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5809
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3900 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0x4001ca6650, 0x16)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001ca6640)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40013c8cc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4000084690?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x400170eef8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x400136af38, {0x369d680, 0x4001a599e0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0?, {0x369d680?, 0x4001a599e0?}, 0x40?, 0x369e300?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40012fde70, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3897
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 1092 [chan send, 108 minutes]:
os/exec.(*Cmd).watchCtx(0x4001717c80, 0x4001715570)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 817
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5792 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0x40019ac9d0, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019ac9c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40019025a0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40017151f0?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x40013a56a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4001d82f38, {0x369d680, 0x40012efa40}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369d680?, 0x40012efa40?}, 0x90?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400190d5a0, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5789
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4072 [chan receive, 2 minutes]:
testing.(*T).Run(0x4001a2afc0, {0x2993fff?, 0x40000006ee?}, 0x4001524000)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x4001a2afc0)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x4001a2afc0, 0x400036f000)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3465
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4556 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x4001a2ba40?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4552
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4561 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x40000a2740, 0x40000a2788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x0?, 0x40000a2740, 0x40000a2788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x36e5778?, 0x4001714af0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x4001714a10?, 0x0?, 0x4001460000?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4557
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 874 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 873
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 1127 [select, 108 minutes]:
net/http.(*persistConn).writeLoop(0x40018207e0)
	/usr/local/go/src/net/http/transport.go:2600 +0x94
created by net/http.(*Transport).dialConn in goroutine 1124
	/usr/local/go/src/net/http/transport.go:1948 +0x1164

goroutine 1049 [chan send, 108 minutes]:
os/exec.(*Cmd).watchCtx(0x4001799680, 0x40017e0000)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1048
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1126 [select, 108 minutes]:
net/http.(*persistConn).readLoop(0x40018207e0)
	/usr/local/go/src/net/http/transport.go:2398 +0xa6c
created by net/http.(*Transport).dialConn in goroutine 1124
	/usr/local/go/src/net/http/transport.go:1947 +0x111c

goroutine 3803 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3802
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3901 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x400133d740, 0x4001d88f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x6c?, 0x400133d740, 0x400133d788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x0?, 0x400133d750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000294000?, 0x4001590180?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3897
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5179 [chan receive, 6 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40013c8840, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5177
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 6076 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40019acc90, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019acc80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40019bba40)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40016467e0?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x400133c6b8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4001535f38, {0x369d680, 0x400179e3c0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x400133c788?, {0x369d680?, 0x400179e3c0?}, 0x1?, 0x36e5778?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400158b3c0, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6073
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4268 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4267
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5512 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x400184ae00?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5501
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 2022 [chan send, 78 minutes]:
os/exec.(*Cmd).watchCtx(0x400070d200, 0x4001844c40)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1492
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1562 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x40000a3f40, 0x4001313f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x30?, 0x40000a3f40, 0x40000a3f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x40014e6900?, 0x4000001040?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40014e7200?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1535
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 3801 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40019ac6d0, 0x17)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019ac6c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40015d0c60)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400030d420?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4001324f38, {0x369d680, 0x4001593650}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x4001593650?}, 0xc0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40006920d0, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3798
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5513 [chan receive, 4 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001962fc0, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5501
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1935 [chan send, 78 minutes]:
os/exec.(*Cmd).watchCtx(0x400032d680, 0x4001c1bf80)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1934
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 3461 [chan receive, 10 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4001a2a000, 0x339b730)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3261
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 1884 [chan send, 78 minutes]:
os/exec.(*Cmd).watchCtx(0x40014e6300, 0x4001646620)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1883
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 4876 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x400133ef40, 0x400133ef88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x2f?, 0x400133ef40, 0x400133ef88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x0?, 0x400133ef50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000294000?, 0x4001a2b340?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4864
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 888 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x4001462300?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 887
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4562 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4561
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4863 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x4001a2b340?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4862
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 6359 [chan receive]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001c4d1a0, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 6371
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5183 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x4001331740, 0x4001331788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x1c?, 0x4001331740, 0x4001331788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x0?, 0x4001331750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000294000?, 0x400184a1c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5179
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5178 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x400184a1c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5177
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4875 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x4001ca6e90, 0xf)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001ca6e80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001a67020)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40018443f0?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4001322f38, {0x369d680, 0x4001bff260}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369d680?, 0x4001bff260?}, 0x60?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001ae1e90, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4864
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3896 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x4001590180?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3920
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5518 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5517
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 6371 [select]:
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x36e5778, 0x40003f56c0}, {0x36d3780, 0x40017c56a0}, 0x1, 0x0, 0x4001309ba0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/loop.go:66 +0x158
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x36e5778?, 0x4000387d50?}, 0x3b9aca00, 0x4001309dc8?, 0x1, 0x4001309ba0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:48 +0x8c
k8s.io/minikube/test/integration.PodWait({0x36e5778, 0x4000387d50}, 0x4000335180, {0x4001c44d70, 0xd}, {0x297122c, 0x7}, {0x2978476, 0xa}, 0xd18c2e2800)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:379 +0x22c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1.4(0x4000335180)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:163 +0x2a0
testing.tRunner(0x4000335180, 0x4001c17aa0)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3582
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4263 [chan receive, 11 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40013ed080, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4261
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 6358 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x4001a46300?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 6371
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5184 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5183
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 6366 [IO wait]:
internal/poll.runtime_pollWait(0xffff4c8d5600, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001818b80?, 0x40008c9000?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4001818b80, {0x40008c9000, 0x1800, 0x1800})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
net.(*netFD).Read(0x4001818b80, {0x40008c9000?, 0x40008c9000?, 0x5?})
	/usr/local/go/src/net/fd_posix.go:68 +0x28
net.(*conn).Read(0x4001c54138, {0x40008c9000?, 0x4001391888?, 0x8b27c?})
	/usr/local/go/src/net/net.go:196 +0x34
crypto/tls.(*atLeastReader).Read(0x4001c3ca80, {0x40008c9000?, 0x40013918e8?, 0x2cb794?})
	/usr/local/go/src/crypto/tls/conn.go:816 +0x38
bytes.(*Buffer).ReadFrom(0x40004dfea8, {0x369dda0, 0x4001c3ca80})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
crypto/tls.(*Conn).readFromUntil(0x40004dfc08, {0xffff4c8c3a80, 0x4001c3c6c0}, 0x4001391990?)
	/usr/local/go/src/crypto/tls/conn.go:838 +0xcc
crypto/tls.(*Conn).readRecordOrCCS(0x40004dfc08, 0x0)
	/usr/local/go/src/crypto/tls/conn.go:627 +0x340
crypto/tls.(*Conn).readRecord(...)
	/usr/local/go/src/crypto/tls/conn.go:589
crypto/tls.(*Conn).Read(0x40004dfc08, {0x40019c9000, 0x1000, 0x542e2c?})
	/usr/local/go/src/crypto/tls/conn.go:1392 +0x14c
bufio.(*Reader).Read(0x40013c89c0, {0x40018d6f20, 0x9, 0x542e44?})
	/usr/local/go/src/bufio/bufio.go:245 +0x188
io.ReadAtLeast({0x369bce0, 0x40013c89c0}, {0x40018d6f20, 0x9, 0x9}, 0x9)
	/usr/local/go/src/io/io.go:335 +0x98
io.ReadFull(...)
	/usr/local/go/src/io/io.go:354
golang.org/x/net/http2.readFrameHeader({0x40018d6f20, 0x9, 0x4000659aa0?}, {0x369bce0?, 0x40013c89c0?})
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/frame.go:242 +0x58
golang.org/x/net/http2.(*Framer).ReadFrame(0x40018d6ee0)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/frame.go:506 +0x70
golang.org/x/net/http2.(*clientConnReadLoop).run(0x4001391f98)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/transport.go:2258 +0xcc
golang.org/x/net/http2.(*ClientConn).readLoop(0x400184b500)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/transport.go:2127 +0x6c
created by golang.org/x/net/http2.(*Transport).newClientConn in goroutine 6365
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/transport.go:912 +0xae0

goroutine 3582 [chan receive]:
testing.(*T).Run(0x400156c700, {0x297625b?, 0x3689f58?}, 0x4001c17aa0)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x400156c700)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:148 +0x724
testing.tRunner(0x400156c700, 0x4001920100)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3513
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4864 [chan receive, 8 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001a67020, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4862
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 6077 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x40000a7740, 0x40000a7788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x0?, 0x40000a7740, 0x40000a7788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x36e5778?, 0x4001647dc0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x4001647ce0?, 0x0?, 0x4001417c80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6073
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 6078 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 6077
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5788 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x400184b180?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5784
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5809 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x4001332f40, 0x4001332f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0xb3?, 0x4001332f40, 0x4001332f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x0?, 0x4001332f50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000294000?, 0x400184b180?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5789
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4262 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x4000335a40?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4261
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5789 [chan receive, 4 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40019025a0, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5784
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5182 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40018b1250, 0xe)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40018b1240)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40013c8840)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40018932d0?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x400133c6a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x40000d6f38, {0x369d680, 0x400186d4a0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369d680?, 0x400186d4a0?}, 0x30?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40016cdbd0, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5179
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4557 [chan receive, 10 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40004eee40, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4552
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3798 [chan receive, 32 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40015d0c60, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3793
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3802 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x4001336740, 0x4001d86f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x18?, 0x4001336740, 0x4001336788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x40003387f0?, 0x40003387f0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001a2ba40?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3798
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 3902 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3901
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3797 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000294000?}, 0x4001a2bc00?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3793
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4267 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x40000a6f40, 0x40000a6f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x60?, 0x40000a6f40, 0x40000a6f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x0?, 0x95c64?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001c44b90?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4263
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5517 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000298150}, 0x40013a2740, 0x40013a2788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000298150}, 0x0?, 0x40013a2740, 0x40013a2788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000298150?}, 0x36e5778?, 0x4001c8d030?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x4001c8cf50?, 0x0?, 0x400070d500?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5513
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5516 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0x40019c77d0, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019c77c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001962fc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40002fd8f0?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x400136ff38, {0x369d680, 0x40006e56b0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x40006e56b0?}, 0x20?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400077f470, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5513
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 6362 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x400075a8d0, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400075a8c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001c4d1a0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40018c2d20?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000298150?}, 0x40017106a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000298150}, 0x4001390f38, {0x369d680, 0x40015933e0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40017107a8?, {0x369d680?, 0x40015933e0?}, 0x30?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40016cc550, 0x3b9aca00, 0x0, 0x1, 0x4000298150)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6359
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174


Test pass (262/321)

Order passed test Duration
3 TestDownloadOnly/v1.28.0/json-events 43.15
4 TestDownloadOnly/v1.28.0/preload-exists 0
8 TestDownloadOnly/v1.28.0/LogsDuration 0.09
9 TestDownloadOnly/v1.28.0/DeleteAll 0.23
10 TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds 0.15
12 TestDownloadOnly/v1.34.2/json-events 33.54
13 TestDownloadOnly/v1.34.2/preload-exists 0
17 TestDownloadOnly/v1.34.2/LogsDuration 0.09
18 TestDownloadOnly/v1.34.2/DeleteAll 0.22
19 TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds 0.14
21 TestDownloadOnly/v1.35.0-beta.0/json-events 2.33
23 TestDownloadOnly/v1.35.0-beta.0/cached-images 0
24 TestDownloadOnly/v1.35.0-beta.0/binaries 0
26 TestDownloadOnly/v1.35.0-beta.0/LogsDuration 0.06
27 TestDownloadOnly/v1.35.0-beta.0/DeleteAll 0.2
28 TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds 0.13
30 TestBinaryMirror 0.64
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.08
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.08
36 TestAddons/Setup 168.85
38 TestAddons/serial/Volcano 41.68
40 TestAddons/serial/GCPAuth/Namespaces 0.18
41 TestAddons/serial/GCPAuth/FakeCredentials 8.9
44 TestAddons/parallel/Registry 15.27
45 TestAddons/parallel/RegistryCreds 0.76
46 TestAddons/parallel/Ingress 17.87
47 TestAddons/parallel/InspektorGadget 11.81
48 TestAddons/parallel/MetricsServer 6.26
50 TestAddons/parallel/CSI 39.33
51 TestAddons/parallel/Headlamp 17.5
52 TestAddons/parallel/CloudSpanner 6.71
53 TestAddons/parallel/LocalPath 52.67
54 TestAddons/parallel/NvidiaDevicePlugin 5.63
55 TestAddons/parallel/Yakd 11.91
57 TestAddons/StoppedEnableDisable 12.34
58 TestCertOptions 36.84
59 TestCertExpiration 230.96
61 TestForceSystemdFlag 39.17
62 TestForceSystemdEnv 40.59
63 TestDockerEnvContainerd 47.57
67 TestErrorSpam/setup 32.82
68 TestErrorSpam/start 0.86
69 TestErrorSpam/status 1.31
70 TestErrorSpam/pause 1.8
71 TestErrorSpam/unpause 1.68
72 TestErrorSpam/stop 1.6
75 TestFunctional/serial/CopySyncFile 0
76 TestFunctional/serial/StartWithProxy 82.27
77 TestFunctional/serial/AuditLog 0
78 TestFunctional/serial/SoftStart 7.32
79 TestFunctional/serial/KubeContext 0.06
80 TestFunctional/serial/KubectlGetPods 0.12
83 TestFunctional/serial/CacheCmd/cache/add_remote 3.5
84 TestFunctional/serial/CacheCmd/cache/add_local 1.4
85 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.06
86 TestFunctional/serial/CacheCmd/cache/list 0.06
87 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.31
88 TestFunctional/serial/CacheCmd/cache/cache_reload 1.9
89 TestFunctional/serial/CacheCmd/cache/delete 0.12
90 TestFunctional/serial/MinikubeKubectlCmd 0.15
91 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.13
92 TestFunctional/serial/ExtraConfig 62.76
93 TestFunctional/serial/ComponentHealth 0.11
94 TestFunctional/serial/LogsCmd 1.49
95 TestFunctional/serial/LogsFileCmd 1.51
96 TestFunctional/serial/InvalidService 4.95
98 TestFunctional/parallel/ConfigCmd 0.68
99 TestFunctional/parallel/DashboardCmd 8.7
100 TestFunctional/parallel/DryRun 0.46
101 TestFunctional/parallel/InternationalLanguage 0.22
102 TestFunctional/parallel/StatusCmd 1.3
106 TestFunctional/parallel/ServiceCmdConnect 7.78
107 TestFunctional/parallel/AddonsCmd 0.15
108 TestFunctional/parallel/PersistentVolumeClaim 24.85
110 TestFunctional/parallel/SSHCmd 0.72
111 TestFunctional/parallel/CpCmd 2.65
113 TestFunctional/parallel/FileSync 0.37
114 TestFunctional/parallel/CertSync 2.31
118 TestFunctional/parallel/NodeLabels 0.09
120 TestFunctional/parallel/NonActiveRuntimeDisabled 0.59
122 TestFunctional/parallel/License 0.32
124 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.69
125 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0
127 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 9.44
128 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.08
129 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0
133 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.11
134 TestFunctional/parallel/ServiceCmd/DeployApp 8.28
135 TestFunctional/parallel/ProfileCmd/profile_not_create 0.46
136 TestFunctional/parallel/ServiceCmd/List 0.63
137 TestFunctional/parallel/ProfileCmd/profile_list 0.57
138 TestFunctional/parallel/ProfileCmd/profile_json_output 0.55
139 TestFunctional/parallel/ServiceCmd/JSONOutput 0.63
140 TestFunctional/parallel/MountCmd/any-port 8.86
141 TestFunctional/parallel/ServiceCmd/HTTPS 0.55
142 TestFunctional/parallel/ServiceCmd/Format 0.38
143 TestFunctional/parallel/ServiceCmd/URL 0.57
144 TestFunctional/parallel/MountCmd/specific-port 2.13
145 TestFunctional/parallel/MountCmd/VerifyCleanup 1.37
146 TestFunctional/parallel/Version/short 0.07
147 TestFunctional/parallel/Version/components 1.37
148 TestFunctional/parallel/ImageCommands/ImageListShort 0.31
149 TestFunctional/parallel/ImageCommands/ImageListTable 0.27
150 TestFunctional/parallel/ImageCommands/ImageListJson 0.33
151 TestFunctional/parallel/ImageCommands/ImageListYaml 0.27
152 TestFunctional/parallel/ImageCommands/ImageBuild 4.08
153 TestFunctional/parallel/ImageCommands/Setup 0.65
154 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 1.33
155 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 1.25
156 TestFunctional/parallel/UpdateContextCmd/no_changes 0.21
157 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.24
158 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.19
159 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.41
160 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.48
161 TestFunctional/parallel/ImageCommands/ImageRemove 0.5
162 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.6
163 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.4
164 TestFunctional/delete_echo-server_images 0.04
165 TestFunctional/delete_my-image_image 0.02
166 TestFunctional/delete_minikube_cached_images 0.02
170 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile 0
172 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog 0
174 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext 0.07
178 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote 3.17
179 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local 1.08
180 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete 0.05
181 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list 0.06
182 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node 0.29
183 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload 1.86
184 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete 0.11
189 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd 0.94
190 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd 1.04
193 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd 0.43
195 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun 0.44
196 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage 0.2
202 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd 0.14
205 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd 0.72
206 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd 2.38
208 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync 0.29
209 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync 1.74
215 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled 0.59
217 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License 0.3
220 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel 0
227 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel 0.1
234 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create 0.42
235 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list 0.39
236 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output 0.39
238 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port 2.11
239 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup 1.28
240 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short 0.08
241 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components 0.51
242 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort 0.24
243 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable 0.23
244 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson 0.23
245 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml 0.24
246 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild 3.47
247 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup 0.27
248 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon 1.14
249 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon 1.07
250 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon 1.31
251 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile 0.34
252 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove 0.48
253 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile 0.67
254 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon 0.38
255 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes 0.17
256 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster 0.19
257 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters 0.17
258 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images 0.04
259 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image 0.02
260 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images 0.02
264 TestMultiControlPlane/serial/StartCluster 209.24
265 TestMultiControlPlane/serial/DeployApp 8.72
266 TestMultiControlPlane/serial/PingHostFromPods 1.81
267 TestMultiControlPlane/serial/AddWorkerNode 28.81
268 TestMultiControlPlane/serial/NodeLabels 0.11
269 TestMultiControlPlane/serial/HAppyAfterClusterStart 1.11
270 TestMultiControlPlane/serial/CopyFile 20.53
271 TestMultiControlPlane/serial/StopSecondaryNode 13.02
272 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.83
273 TestMultiControlPlane/serial/RestartSecondaryNode 13.98
274 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 1.37
275 TestMultiControlPlane/serial/RestartClusterKeepsNodes 91.94
276 TestMultiControlPlane/serial/DeleteSecondaryNode 11.31
277 TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.84
278 TestMultiControlPlane/serial/StopCluster 36.41
279 TestMultiControlPlane/serial/RestartCluster 59.83
280 TestMultiControlPlane/serial/DegradedAfterClusterRestart 0.81
281 TestMultiControlPlane/serial/AddSecondaryNode 49.13
282 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd 1.17
287 TestJSONOutput/start/Command 80.77
288 TestJSONOutput/start/Audit 0
290 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
291 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
293 TestJSONOutput/pause/Command 0.72
294 TestJSONOutput/pause/Audit 0
296 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
297 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
299 TestJSONOutput/unpause/Command 0.67
300 TestJSONOutput/unpause/Audit 0
302 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
303 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
305 TestJSONOutput/stop/Command 6.06
306 TestJSONOutput/stop/Audit 0
308 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
309 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
310 TestErrorJSONOutput 0.24
312 TestKicCustomNetwork/create_custom_network 55.7
313 TestKicCustomNetwork/use_default_bridge_network 34.01
314 TestKicExistingNetwork 34.15
315 TestKicCustomSubnet 34.93
316 TestKicStaticIP 35.97
317 TestMainNoArgs 0.05
318 TestMinikubeProfile 75.98
321 TestMountStart/serial/StartWithMountFirst 8.51
322 TestMountStart/serial/VerifyMountFirst 0.27
323 TestMountStart/serial/StartWithMountSecond 6.45
324 TestMountStart/serial/VerifyMountSecond 0.29
325 TestMountStart/serial/DeleteFirst 1.72
326 TestMountStart/serial/VerifyMountPostDelete 0.33
327 TestMountStart/serial/Stop 1.3
328 TestMountStart/serial/RestartStopped 8.11
329 TestMountStart/serial/VerifyMountPostStop 0.28
332 TestMultiNode/serial/FreshStart2Nodes 134.59
333 TestMultiNode/serial/DeployApp2Nodes 6.01
334 TestMultiNode/serial/PingHostFrom2Pods 1
335 TestMultiNode/serial/AddNode 58.5
336 TestMultiNode/serial/MultiNodeLabels 0.09
337 TestMultiNode/serial/ProfileList 0.73
338 TestMultiNode/serial/CopyFile 10.75
339 TestMultiNode/serial/StopNode 2.39
340 TestMultiNode/serial/StartAfterStop 7.87
341 TestMultiNode/serial/RestartKeepsNodes 82.92
342 TestMultiNode/serial/DeleteNode 5.99
343 TestMultiNode/serial/StopMultiNode 24.21
344 TestMultiNode/serial/RestartMultiNode 58.15
345 TestMultiNode/serial/ValidateNameConflict 34.25
350 TestPreload 113.79
352 TestScheduledStopUnix 106.45
355 TestInsufficientStorage 9.93
356 TestRunningBinaryUpgrade 321.51
359 TestMissingContainerUpgrade 188.27
361 TestNoKubernetes/serial/StartNoK8sWithVersion 0.09
362 TestNoKubernetes/serial/StartWithK8s 40.59
363 TestNoKubernetes/serial/StartWithStopK8s 8.81
364 TestNoKubernetes/serial/Start 8.51
365 TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads 0
366 TestNoKubernetes/serial/VerifyK8sNotRunning 0.27
367 TestNoKubernetes/serial/ProfileList 1
368 TestNoKubernetes/serial/Stop 1.32
369 TestNoKubernetes/serial/StartNoArgs 7.45
370 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.38
371 TestStoppedBinaryUpgrade/Setup 1.01
372 TestStoppedBinaryUpgrade/Upgrade 303.75
373 TestStoppedBinaryUpgrade/MinikubeLogs 1.99
382 TestPause/serial/Start 79.38
383 TestPause/serial/SecondStartNoReconfiguration 8.54
387 TestPause/serial/Pause 0.91
388 TestPause/serial/VerifyStatus 0.43
389 TestPause/serial/Unpause 0.86
395 TestPause/serial/PauseAgain 1.18
396 TestPause/serial/DeletePaused 3.3
397 TestPause/serial/VerifyDeletedResources 0.2
TestDownloadOnly/v1.28.0/json-events (43.15s)

=== RUN   TestDownloadOnly/v1.28.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-979775 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-979775 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (43.152782748s)
--- PASS: TestDownloadOnly/v1.28.0/json-events (43.15s)

TestDownloadOnly/v1.28.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.28.0/preload-exists
I1201 19:05:59.290643    4305 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
I1201 19:05:59.290722    4305 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.28.0/preload-exists (0.00s)

TestDownloadOnly/v1.28.0/LogsDuration (0.09s)

=== RUN   TestDownloadOnly/v1.28.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-979775
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-979775: exit status 85 (88.84337ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬──────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │ END TIME │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼──────────┤
	│ start   │ -o=json --download-only -p download-only-979775 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-979775 │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │          │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴──────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:05:16
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:05:16.179136    4311 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:05:16.179273    4311 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:05:16.179282    4311 out.go:374] Setting ErrFile to fd 2...
	I1201 19:05:16.179288    4311 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:05:16.179559    4311 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	W1201 19:05:16.179682    4311 root.go:314] Error reading config file at /home/jenkins/minikube-integration/21997-2497/.minikube/config/config.json: open /home/jenkins/minikube-integration/21997-2497/.minikube/config/config.json: no such file or directory
	I1201 19:05:16.180064    4311 out.go:368] Setting JSON to true
	I1201 19:05:16.180822    4311 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":2868,"bootTime":1764613049,"procs":150,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:05:16.180890    4311 start.go:143] virtualization:  
	I1201 19:05:16.186472    4311 out.go:99] [download-only-979775] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	W1201 19:05:16.186654    4311 preload.go:354] Failed to list preload files: open /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball: no such file or directory
	I1201 19:05:16.186765    4311 notify.go:221] Checking for updates...
	I1201 19:05:16.190327    4311 out.go:171] MINIKUBE_LOCATION=21997
	I1201 19:05:16.193439    4311 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:05:16.196660    4311 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:05:16.199703    4311 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:05:16.202651    4311 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1201 19:05:16.208574    4311 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1201 19:05:16.208861    4311 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:05:16.232991    4311 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:05:16.233123    4311 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:05:16.658906    4311 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-01 19:05:16.645832926 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:05:16.659007    4311 docker.go:319] overlay module found
	I1201 19:05:16.662076    4311 out.go:99] Using the docker driver based on user configuration
	I1201 19:05:16.662114    4311 start.go:309] selected driver: docker
	I1201 19:05:16.662121    4311 start.go:927] validating driver "docker" against <nil>
	I1201 19:05:16.662216    4311 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:05:16.718732    4311 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:25 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-01 19:05:16.708971031 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:05:16.718879    4311 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1201 19:05:16.719143    4311 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1201 19:05:16.719309    4311 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1201 19:05:16.722343    4311 out.go:171] Using Docker driver with root privileges
	I1201 19:05:16.725173    4311 cni.go:84] Creating CNI manager for ""
	I1201 19:05:16.725236    4311 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:05:16.725250    4311 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1201 19:05:16.725322    4311 start.go:353] cluster config:
	{Name:download-only-979775 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:download-only-979775 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Co
ntainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:05:16.728287    4311 out.go:99] Starting "download-only-979775" primary control-plane node in "download-only-979775" cluster
	I1201 19:05:16.728306    4311 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:05:16.731346    4311 out.go:99] Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:05:16.731399    4311 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1201 19:05:16.731600    4311 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:05:16.747919    4311 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b to local cache
	I1201 19:05:16.748174    4311 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local cache directory
	I1201 19:05:16.748327    4311 image.go:150] Writing gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b to local cache
	I1201 19:05:16.782786    4311 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1201 19:05:16.782823    4311 cache.go:65] Caching tarball of preloaded images
	I1201 19:05:16.783010    4311 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1201 19:05:16.786285    4311 out.go:99] Downloading Kubernetes v1.28.0 preload ...
	I1201 19:05:16.786341    4311 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1201 19:05:16.867558    4311 preload.go:295] Got checksum from GCS API "38d7f581f2fa4226c8af2c9106b982b7"
	I1201 19:05:16.867686    4311 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4?checksum=md5:38d7f581f2fa4226c8af2c9106b982b7 -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1201 19:05:22.465649    4311 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b as a tarball
	
	
	* The control-plane node download-only-979775 host does not exist
	  To start a cluster, run: "minikube start -p download-only-979775"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.0/LogsDuration (0.09s)

TestDownloadOnly/v1.28.0/DeleteAll (0.23s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.28.0/DeleteAll (0.23s)

TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.15s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-979775
--- PASS: TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.15s)

TestDownloadOnly/v1.34.2/json-events (33.54s)

=== RUN   TestDownloadOnly/v1.34.2/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-272875 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-272875 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (33.543203031s)
--- PASS: TestDownloadOnly/v1.34.2/json-events (33.54s)

TestDownloadOnly/v1.34.2/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.34.2/preload-exists
I1201 19:06:33.302876    4305 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
I1201 19:06:33.302915    4305 preload.go:203] Found local preload: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.34.2/preload-exists (0.00s)

TestDownloadOnly/v1.34.2/LogsDuration (0.09s)

=== RUN   TestDownloadOnly/v1.34.2/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-272875
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-272875: exit status 85 (87.808982ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-979775 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-979775 │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                 │ minikube             │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │ 01 Dec 25 19:05 UTC │
	│ delete  │ -p download-only-979775                                                                                                                                                               │ download-only-979775 │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │ 01 Dec 25 19:05 UTC │
	│ start   │ -o=json --download-only -p download-only-272875 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-272875 │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:05:59
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:05:59.798132    4507 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:05:59.798247    4507 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:05:59.798253    4507 out.go:374] Setting ErrFile to fd 2...
	I1201 19:05:59.798258    4507 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:05:59.798661    4507 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:05:59.799131    4507 out.go:368] Setting JSON to true
	I1201 19:05:59.799852    4507 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":2911,"bootTime":1764613049,"procs":144,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:05:59.799938    4507 start.go:143] virtualization:  
	I1201 19:05:59.803298    4507 out.go:99] [download-only-272875] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:05:59.803524    4507 notify.go:221] Checking for updates...
	I1201 19:05:59.806534    4507 out.go:171] MINIKUBE_LOCATION=21997
	I1201 19:05:59.809566    4507 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:05:59.812414    4507 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:05:59.815374    4507 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:05:59.818263    4507 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1201 19:05:59.823919    4507 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1201 19:05:59.824203    4507 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:05:59.846977    4507 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:05:59.847081    4507 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:05:59.916518    4507 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-01 19:05:59.907532625 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:05:59.916622    4507 docker.go:319] overlay module found
	I1201 19:05:59.919518    4507 out.go:99] Using the docker driver based on user configuration
	I1201 19:05:59.919560    4507 start.go:309] selected driver: docker
	I1201 19:05:59.919568    4507 start.go:927] validating driver "docker" against <nil>
	I1201 19:05:59.919697    4507 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:05:59.984332    4507 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-01 19:05:59.974904935 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:05:59.984495    4507 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1201 19:05:59.984778    4507 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1201 19:05:59.984929    4507 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1201 19:05:59.988180    4507 out.go:171] Using Docker driver with root privileges
	I1201 19:05:59.990987    4507 cni.go:84] Creating CNI manager for ""
	I1201 19:05:59.991057    4507 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1201 19:05:59.991070    4507 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1201 19:05:59.991145    4507 start.go:353] cluster config:
	{Name:download-only-272875 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:download-only-272875 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:05:59.993989    4507 out.go:99] Starting "download-only-272875" primary control-plane node in "download-only-272875" cluster
	I1201 19:05:59.994007    4507 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1201 19:05:59.996793    4507 out.go:99] Pulling base image v0.0.48-1764169655-21974 ...
	I1201 19:05:59.996831    4507 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1201 19:05:59.996988    4507 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1201 19:06:00.013643    4507 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b to local cache
	I1201 19:06:00.013792    4507 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local cache directory
	I1201 19:06:00.013816    4507 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local cache directory, skipping pull
	I1201 19:06:00.013824    4507 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in cache, skipping pull
	I1201 19:06:00.013832    4507 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b as a tarball
	I1201 19:06:00.055824    4507 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.2/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1201 19:06:00.055851    4507 cache.go:65] Caching tarball of preloaded images
	I1201 19:06:00.056066    4507 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1201 19:06:00.059433    4507 out.go:99] Downloading Kubernetes v1.34.2 preload ...
	I1201 19:06:00.059476    4507 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1201 19:06:00.153453    4507 preload.go:295] Got checksum from GCS API "cd1a05d5493c9270e248bf47fb3f071d"
	I1201 19:06:00.153536    4507 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.2/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4?checksum=md5:cd1a05d5493c9270e248bf47fb3f071d -> /home/jenkins/minikube-integration/21997-2497/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	
	
	* The control-plane node download-only-272875 host does not exist
	  To start a cluster, run: "minikube start -p download-only-272875"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.34.2/LogsDuration (0.09s)

TestDownloadOnly/v1.34.2/DeleteAll (0.22s)

=== RUN   TestDownloadOnly/v1.34.2/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.34.2/DeleteAll (0.22s)

TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds (0.14s)

=== RUN   TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-272875
--- PASS: TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds (0.14s)

TestDownloadOnly/v1.35.0-beta.0/json-events (2.33s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-624539 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-624539 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (2.328563874s)
--- PASS: TestDownloadOnly/v1.35.0-beta.0/json-events (2.33s)

TestDownloadOnly/v1.35.0-beta.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/cached-images
--- PASS: TestDownloadOnly/v1.35.0-beta.0/cached-images (0.00s)

TestDownloadOnly/v1.35.0-beta.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/binaries
--- PASS: TestDownloadOnly/v1.35.0-beta.0/binaries (0.00s)

TestDownloadOnly/v1.35.0-beta.0/LogsDuration (0.06s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-624539
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-624539: exit status 85 (64.280108ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                             ARGS                                                                                             │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-979775 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd        │ download-only-979775 │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                        │ minikube             │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │ 01 Dec 25 19:05 UTC │
	│ delete  │ -p download-only-979775                                                                                                                                                                      │ download-only-979775 │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │ 01 Dec 25 19:05 UTC │
	│ start   │ -o=json --download-only -p download-only-272875 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd        │ download-only-272875 │ jenkins │ v1.37.0 │ 01 Dec 25 19:05 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                        │ minikube             │ jenkins │ v1.37.0 │ 01 Dec 25 19:06 UTC │ 01 Dec 25 19:06 UTC │
	│ delete  │ -p download-only-272875                                                                                                                                                                      │ download-only-272875 │ jenkins │ v1.37.0 │ 01 Dec 25 19:06 UTC │ 01 Dec 25 19:06 UTC │
	│ start   │ -o=json --download-only -p download-only-624539 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-624539 │ jenkins │ v1.37.0 │ 01 Dec 25 19:06 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/01 19:06:33
	Running on machine: ip-172-31-24-2
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1201 19:06:33.797637    4700 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:06:33.797759    4700 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:06:33.797771    4700 out.go:374] Setting ErrFile to fd 2...
	I1201 19:06:33.797776    4700 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:06:33.798071    4700 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:06:33.798497    4700 out.go:368] Setting JSON to true
	I1201 19:06:33.799272    4700 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":2945,"bootTime":1764613049,"procs":144,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:06:33.799338    4700 start.go:143] virtualization:  
	I1201 19:06:33.802759    4700 out.go:99] [download-only-624539] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:06:33.803004    4700 notify.go:221] Checking for updates...
	I1201 19:06:33.806053    4700 out.go:171] MINIKUBE_LOCATION=21997
	I1201 19:06:33.809353    4700 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:06:33.812388    4700 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:06:33.815293    4700 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:06:33.818324    4700 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1201 19:06:33.824383    4700 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1201 19:06:33.824697    4700 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:06:33.850787    4700 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:06:33.850888    4700 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:06:33.914604    4700 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-01 19:06:33.905768648 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:06:33.914712    4700 docker.go:319] overlay module found
	I1201 19:06:33.917712    4700 out.go:99] Using the docker driver based on user configuration
	I1201 19:06:33.917752    4700 start.go:309] selected driver: docker
	I1201 19:06:33.917759    4700 start.go:927] validating driver "docker" against <nil>
	I1201 19:06:33.917865    4700 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:06:33.969919    4700 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-01 19:06:33.961281821 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:06:33.970079    4700 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1201 19:06:33.970366    4700 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1201 19:06:33.970519    4700 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1201 19:06:33.973444    4700 out.go:171] Using Docker driver with root privileges
	
	
	* The control-plane node download-only-624539 host does not exist
	  To start a cluster, run: "minikube start -p download-only-624539"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.35.0-beta.0/LogsDuration (0.06s)

TestDownloadOnly/v1.35.0-beta.0/DeleteAll (0.2s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.35.0-beta.0/DeleteAll (0.20s)

TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds (0.13s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-624539
--- PASS: TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds (0.13s)

TestBinaryMirror (0.64s)

=== RUN   TestBinaryMirror
I1201 19:06:37.478982    4305 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubectl.sha256
aaa_download_only_test.go:309: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p binary-mirror-523365 --alsologtostderr --binary-mirror http://127.0.0.1:37429 --driver=docker  --container-runtime=containerd
helpers_test.go:175: Cleaning up "binary-mirror-523365" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p binary-mirror-523365
--- PASS: TestBinaryMirror (0.64s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1000: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-569760
addons_test.go:1000: (dbg) Non-zero exit: out/minikube-linux-arm64 addons enable dashboard -p addons-569760: exit status 85 (77.265772ms)

-- stdout --
	* Profile "addons-569760" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-569760"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1011: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-569760
addons_test.go:1011: (dbg) Non-zero exit: out/minikube-linux-arm64 addons disable dashboard -p addons-569760: exit status 85 (80.542949ms)

-- stdout --
	* Profile "addons-569760" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-569760"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

TestAddons/Setup (168.85s)

=== RUN   TestAddons/Setup
addons_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p addons-569760 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher
addons_test.go:108: (dbg) Done: out/minikube-linux-arm64 start -p addons-569760 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher: (2m48.845092242s)
--- PASS: TestAddons/Setup (168.85s)

TestAddons/serial/Volcano (41.68s)

=== RUN   TestAddons/serial/Volcano
addons_test.go:884: volcano-controller stabilized in 48.5028ms
addons_test.go:868: volcano-scheduler stabilized in 48.572332ms
addons_test.go:876: volcano-admission stabilized in 48.856491ms
addons_test.go:890: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-scheduler-76c996c8bf-d7knn" [39d25b45-9996-4b2b-8e3e-ebfc6e61848a] Running
addons_test.go:890: (dbg) TestAddons/serial/Volcano: app=volcano-scheduler healthy within 6.007476613s
addons_test.go:894: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-admission-6c447bd768-zp6mb" [e571017a-abbd-4d5f-a8a2-b0ee73efdd4d] Running
addons_test.go:894: (dbg) TestAddons/serial/Volcano: app=volcano-admission healthy within 6.003642385s
addons_test.go:898: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-controllers-6fd4f85cb8-hckpr" [dbe3002d-84c5-40e1-b774-2ce9bf264363] Running
addons_test.go:898: (dbg) TestAddons/serial/Volcano: app=volcano-controller healthy within 5.004159083s
addons_test.go:903: (dbg) Run:  kubectl --context addons-569760 delete -n volcano-system job volcano-admission-init
addons_test.go:909: (dbg) Run:  kubectl --context addons-569760 create -f testdata/vcjob.yaml
addons_test.go:917: (dbg) Run:  kubectl --context addons-569760 get vcjob -n my-volcano
addons_test.go:935: (dbg) TestAddons/serial/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:352: "test-job-nginx-0" [4c112791-405f-4a5a-809d-f45aa2c5d25a] Pending
helpers_test.go:352: "test-job-nginx-0" [4c112791-405f-4a5a-809d-f45aa2c5d25a] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "test-job-nginx-0" [4c112791-405f-4a5a-809d-f45aa2c5d25a] Running
addons_test.go:935: (dbg) TestAddons/serial/Volcano: volcano.sh/job-name=test-job healthy within 12.004002009s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable volcano --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable volcano --alsologtostderr -v=1: (11.985407902s)
--- PASS: TestAddons/serial/Volcano (41.68s)

TestAddons/serial/GCPAuth/Namespaces (0.18s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:630: (dbg) Run:  kubectl --context addons-569760 create ns new-namespace
addons_test.go:644: (dbg) Run:  kubectl --context addons-569760 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.18s)

TestAddons/serial/GCPAuth/FakeCredentials (8.90s)

=== RUN   TestAddons/serial/GCPAuth/FakeCredentials
addons_test.go:675: (dbg) Run:  kubectl --context addons-569760 create -f testdata/busybox.yaml
addons_test.go:682: (dbg) Run:  kubectl --context addons-569760 create sa gcp-auth-test
addons_test.go:688: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:352: "busybox" [7842e6e7-3598-4043-8a8f-3ac05ad172e9] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:352: "busybox" [7842e6e7-3598-4043-8a8f-3ac05ad172e9] Running
addons_test.go:688: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: integration-test=busybox healthy within 8.003975366s
addons_test.go:694: (dbg) Run:  kubectl --context addons-569760 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:706: (dbg) Run:  kubectl --context addons-569760 describe sa gcp-auth-test
addons_test.go:720: (dbg) Run:  kubectl --context addons-569760 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:744: (dbg) Run:  kubectl --context addons-569760 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
--- PASS: TestAddons/serial/GCPAuth/FakeCredentials (8.90s)

TestAddons/parallel/Registry (15.27s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:382: registry stabilized in 16.032017ms
addons_test.go:384: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:352: "registry-6b586f9694-vqs27" [68d29de4-fa41-4121-9a15-66bff7e9d9d7] Running
addons_test.go:384: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.008039054s
addons_test.go:387: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:352: "registry-proxy-55xw9" [0e046532-0636-4405-b672-c54b2cc21b87] Running
addons_test.go:387: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.005936362s
addons_test.go:392: (dbg) Run:  kubectl --context addons-569760 delete po -l run=registry-test --now
addons_test.go:397: (dbg) Run:  kubectl --context addons-569760 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:397: (dbg) Done: kubectl --context addons-569760 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (4.179007818s)
addons_test.go:411: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 ip
2025/12/01 19:10:41 [DEBUG] GET http://192.168.49.2:5000
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (15.27s)

TestAddons/parallel/RegistryCreds (0.76s)

=== RUN   TestAddons/parallel/RegistryCreds
=== PAUSE TestAddons/parallel/RegistryCreds

=== CONT  TestAddons/parallel/RegistryCreds
addons_test.go:323: registry-creds stabilized in 7.856673ms
addons_test.go:325: (dbg) Run:  out/minikube-linux-arm64 addons configure registry-creds -f ./testdata/addons_testconfig.json -p addons-569760
addons_test.go:332: (dbg) Run:  kubectl --context addons-569760 -n kube-system get secret -o yaml
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable registry-creds --alsologtostderr -v=1
--- PASS: TestAddons/parallel/RegistryCreds (0.76s)

TestAddons/parallel/Ingress (17.87s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:209: (dbg) Run:  kubectl --context addons-569760 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:234: (dbg) Run:  kubectl --context addons-569760 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:247: (dbg) Run:  kubectl --context addons-569760 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:352: "nginx" [c6ff8de0-4bd8-4e66-829f-d143647994ea] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "nginx" [c6ff8de0-4bd8-4e66-829f-d143647994ea] Running
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 7.008349364s
I1201 19:11:55.786863    4305 kapi.go:150] Service nginx in namespace default found.
addons_test.go:264: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:288: (dbg) Run:  kubectl --context addons-569760 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:293: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 ip
addons_test.go:299: (dbg) Run:  nslookup hello-john.test 192.168.49.2
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable ingress-dns --alsologtostderr -v=1: (1.26908353s)
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable ingress --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable ingress --alsologtostderr -v=1: (7.836589443s)
--- PASS: TestAddons/parallel/Ingress (17.87s)

TestAddons/parallel/InspektorGadget (11.81s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:823: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:352: "gadget-7mq4p" [a291afd7-8abc-464f-b476-efc511c0dca1] Running
addons_test.go:823: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 6.004581551s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable inspektor-gadget --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable inspektor-gadget --alsologtostderr -v=1: (5.800731611s)
--- PASS: TestAddons/parallel/InspektorGadget (11.81s)

TestAddons/parallel/MetricsServer (6.26s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:455: metrics-server stabilized in 51.779792ms
addons_test.go:457: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:352: "metrics-server-85b7d694d7-bwrlx" [1148000a-39e7-4c69-b5ce-cb5caa2926f0] Running
addons_test.go:457: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.00696666s
addons_test.go:463: (dbg) Run:  kubectl --context addons-569760 top pods -n kube-system
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable metrics-server --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable metrics-server --alsologtostderr -v=1: (1.088154539s)
--- PASS: TestAddons/parallel/MetricsServer (6.26s)

TestAddons/parallel/CSI (39.33s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
I1201 19:11:08.770145    4305 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
I1201 19:11:08.774633    4305 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
I1201 19:11:08.774659    4305 kapi.go:107] duration metric: took 6.841257ms to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
addons_test.go:549: csi-hostpath-driver pods stabilized in 6.856249ms
addons_test.go:552: (dbg) Run:  kubectl --context addons-569760 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:557: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:562: (dbg) Run:  kubectl --context addons-569760 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:567: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:352: "task-pv-pod" [1ede4927-1083-46ac-93e0-9afa50054163] Pending
helpers_test.go:352: "task-pv-pod" [1ede4927-1083-46ac-93e0-9afa50054163] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:352: "task-pv-pod" [1ede4927-1083-46ac-93e0-9afa50054163] Running
addons_test.go:567: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 7.003585853s
addons_test.go:572: (dbg) Run:  kubectl --context addons-569760 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:577: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:427: (dbg) Run:  kubectl --context addons-569760 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:427: (dbg) Run:  kubectl --context addons-569760 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:582: (dbg) Run:  kubectl --context addons-569760 delete pod task-pv-pod
addons_test.go:582: (dbg) Done: kubectl --context addons-569760 delete pod task-pv-pod: (1.064257713s)
addons_test.go:588: (dbg) Run:  kubectl --context addons-569760 delete pvc hpvc
addons_test.go:594: (dbg) Run:  kubectl --context addons-569760 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:599: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:604: (dbg) Run:  kubectl --context addons-569760 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:609: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:352: "task-pv-pod-restore" [02b20a1a-65fa-4c46-9270-bc42d4cfcdbb] Pending
helpers_test.go:352: "task-pv-pod-restore" [02b20a1a-65fa-4c46-9270-bc42d4cfcdbb] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:352: "task-pv-pod-restore" [02b20a1a-65fa-4c46-9270-bc42d4cfcdbb] Running
addons_test.go:609: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 8.00360163s
addons_test.go:614: (dbg) Run:  kubectl --context addons-569760 delete pod task-pv-pod-restore
addons_test.go:614: (dbg) Done: kubectl --context addons-569760 delete pod task-pv-pod-restore: (1.320706276s)
addons_test.go:618: (dbg) Run:  kubectl --context addons-569760 delete pvc hpvc-restore
addons_test.go:622: (dbg) Run:  kubectl --context addons-569760 delete volumesnapshot new-snapshot-demo
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable volumesnapshots --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable volumesnapshots --alsologtostderr -v=1: (1.226587736s)
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.936795211s)
--- PASS: TestAddons/parallel/CSI (39.33s)

TestAddons/parallel/Headlamp (17.50s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:808: (dbg) Run:  out/minikube-linux-arm64 addons enable headlamp -p addons-569760 --alsologtostderr -v=1
addons_test.go:808: (dbg) Done: out/minikube-linux-arm64 addons enable headlamp -p addons-569760 --alsologtostderr -v=1: (1.624627943s)
addons_test.go:813: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:352: "headlamp-dfcdc64b-7rqv4" [6fe528b7-0ec8-4757-a93b-d9d2b331fb2a] Pending
helpers_test.go:352: "headlamp-dfcdc64b-7rqv4" [6fe528b7-0ec8-4757-a93b-d9d2b331fb2a] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:352: "headlamp-dfcdc64b-7rqv4" [6fe528b7-0ec8-4757-a93b-d9d2b331fb2a] Running
addons_test.go:813: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 10.002996493s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable headlamp --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable headlamp --alsologtostderr -v=1: (5.868278176s)
--- PASS: TestAddons/parallel/Headlamp (17.50s)

TestAddons/parallel/CloudSpanner (6.71s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:840: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:352: "cloud-spanner-emulator-5bdddb765-cxljv" [fb4fba2b-57e4-4c8d-a667-49b0fb13cea9] Running
addons_test.go:840: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 6.003327667s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable cloud-spanner --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CloudSpanner (6.71s)

TestAddons/parallel/LocalPath (52.67s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:949: (dbg) Run:  kubectl --context addons-569760 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:955: (dbg) Run:  kubectl --context addons-569760 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:959: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-569760 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:962: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:352: "test-local-path" [f0b24f5d-4904-45c8-94ed-9e8921deb768] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:352: "test-local-path" [f0b24f5d-4904-45c8-94ed-9e8921deb768] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:352: "test-local-path" [f0b24f5d-4904-45c8-94ed-9e8921deb768] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:962: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 3.003693563s
addons_test.go:967: (dbg) Run:  kubectl --context addons-569760 get pvc test-pvc -o=json
addons_test.go:976: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 ssh "cat /opt/local-path-provisioner/pvc-cba66158-a49a-431e-ba41-675423300895_default_test-pvc/file1"
addons_test.go:988: (dbg) Run:  kubectl --context addons-569760 delete pod test-local-path
addons_test.go:992: (dbg) Run:  kubectl --context addons-569760 delete pvc test-pvc
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (43.215692986s)
--- PASS: TestAddons/parallel/LocalPath (52.67s)

TestAddons/parallel/NvidiaDevicePlugin (5.63s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1025: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:352: "nvidia-device-plugin-daemonset-6gg44" [389d06ec-b97b-4a0e-a915-0558014ac606] Running
addons_test.go:1025: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.004634653s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable nvidia-device-plugin --alsologtostderr -v=1
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.63s)

TestAddons/parallel/Yakd (11.91s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:1047: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:352: "yakd-dashboard-5ff678cb9-dd552" [5e8fc77f-2415-48a6-8020-501f4a16ef3b] Running
addons_test.go:1047: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 6.004083076s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-569760 addons disable yakd --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-569760 addons disable yakd --alsologtostderr -v=1: (5.908854434s)
--- PASS: TestAddons/parallel/Yakd (11.91s)

TestAddons/StoppedEnableDisable (12.34s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:172: (dbg) Run:  out/minikube-linux-arm64 stop -p addons-569760
addons_test.go:172: (dbg) Done: out/minikube-linux-arm64 stop -p addons-569760: (12.055628187s)
addons_test.go:176: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-569760
addons_test.go:180: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-569760
addons_test.go:185: (dbg) Run:  out/minikube-linux-arm64 addons disable gvisor -p addons-569760
--- PASS: TestAddons/StoppedEnableDisable (12.34s)
TestCertOptions (36.84s)
=== RUN   TestCertOptions
=== PAUSE TestCertOptions
=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-arm64 start -p cert-options-741184 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd
cert_options_test.go:49: (dbg) Done: out/minikube-linux-arm64 start -p cert-options-741184 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd: (33.774700869s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-arm64 -p cert-options-741184 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-741184 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-arm64 ssh -p cert-options-741184 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-741184" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-options-741184
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p cert-options-741184: (2.276273427s)
--- PASS: TestCertOptions (36.84s)
TestCertExpiration (230.96s)
=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration
=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-553175 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd
cert_options_test.go:123: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-553175 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd: (40.521057997s)
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-553175 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd
cert_options_test.go:131: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-553175 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd: (7.596985826s)
helpers_test.go:175: Cleaning up "cert-expiration-553175" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-expiration-553175
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p cert-expiration-553175: (2.838226027s)
--- PASS: TestCertExpiration (230.96s)
TestForceSystemdFlag (39.17s)
=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag
=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-flag-782329 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
docker_test.go:91: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-flag-782329 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (36.345025267s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-flag-782329 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-flag-782329" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-flag-782329
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-flag-782329: (2.352153581s)
--- PASS: TestForceSystemdFlag (39.17s)
TestForceSystemdEnv (40.59s)
=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv
=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-env-481462 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
docker_test.go:155: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-env-481462 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (37.278527417s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-env-481462 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-env-481462" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-env-481462
E1201 20:31:46.971828    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-env-481462: (2.867696514s)
--- PASS: TestForceSystemdEnv (40.59s)
TestDockerEnvContainerd (47.57s)
=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with containerd true linux arm64
docker_test.go:181: (dbg) Run:  out/minikube-linux-arm64 start -p dockerenv-019170 --driver=docker  --container-runtime=containerd
docker_test.go:181: (dbg) Done: out/minikube-linux-arm64 start -p dockerenv-019170 --driver=docker  --container-runtime=containerd: (31.423450369s)
docker_test.go:189: (dbg) Run:  /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-019170"
docker_test.go:189: (dbg) Done: /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-019170": (1.108000103s)
docker_test.go:220: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-6GBePN6xV4QU/agent.23866" SSH_AGENT_PID="23867" DOCKER_HOST=ssh://docker@127.0.0.1:32773 docker version"
docker_test.go:243: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-6GBePN6xV4QU/agent.23866" SSH_AGENT_PID="23867" DOCKER_HOST=ssh://docker@127.0.0.1:32773 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env"
docker_test.go:243: (dbg) Done: /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-6GBePN6xV4QU/agent.23866" SSH_AGENT_PID="23867" DOCKER_HOST=ssh://docker@127.0.0.1:32773 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env": (1.357916443s)
docker_test.go:250: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-6GBePN6xV4QU/agent.23866" SSH_AGENT_PID="23867" DOCKER_HOST=ssh://docker@127.0.0.1:32773 docker image ls"
helpers_test.go:175: Cleaning up "dockerenv-019170" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p dockerenv-019170
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p dockerenv-019170: (2.213007659s)
--- PASS: TestDockerEnvContainerd (47.57s)
TestErrorSpam/setup (32.82s)
=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-arm64 start -p nospam-320062 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-320062 --driver=docker  --container-runtime=containerd
error_spam_test.go:81: (dbg) Done: out/minikube-linux-arm64 start -p nospam-320062 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-320062 --driver=docker  --container-runtime=containerd: (32.823956889s)
--- PASS: TestErrorSpam/setup (32.82s)
TestErrorSpam/start (0.86s)
=== RUN   TestErrorSpam/start
error_spam_test.go:206: Cleaning up 1 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 start --dry-run
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 start --dry-run
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 start --dry-run
--- PASS: TestErrorSpam/start (0.86s)
TestErrorSpam/status (1.31s)
=== RUN   TestErrorSpam/status
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 status
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 status
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 status
--- PASS: TestErrorSpam/status (1.31s)
TestErrorSpam/pause (1.8s)
=== RUN   TestErrorSpam/pause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 pause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 pause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 pause
--- PASS: TestErrorSpam/pause (1.80s)
TestErrorSpam/unpause (1.68s)
=== RUN   TestErrorSpam/unpause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 unpause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 unpause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 unpause
--- PASS: TestErrorSpam/unpause (1.68s)
TestErrorSpam/stop (1.6s)
=== RUN   TestErrorSpam/stop
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 stop
error_spam_test.go:149: (dbg) Done: out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 stop: (1.404054568s)
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 stop
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-320062 --log_dir /tmp/nospam-320062 stop
--- PASS: TestErrorSpam/stop (1.60s)
TestFunctional/serial/CopySyncFile (0s)
=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)
TestFunctional/serial/StartWithProxy (82.27s)
=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-019259 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd
E1201 19:14:27.095153    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:27.101539    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:27.112893    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:27.134160    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:27.175533    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:27.256901    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:27.418167    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:27.739544    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:28.380953    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:29.663228    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:32.226068    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:37.348321    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:14:47.589795    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:15:08.071183    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Done: out/minikube-linux-arm64 start -p functional-019259 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd: (1m22.265806648s)
--- PASS: TestFunctional/serial/StartWithProxy (82.27s)
TestFunctional/serial/AuditLog (0s)
=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)
TestFunctional/serial/SoftStart (7.32s)
=== RUN   TestFunctional/serial/SoftStart
I1201 19:15:18.932974    4305 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-019259 --alsologtostderr -v=8
functional_test.go:674: (dbg) Done: out/minikube-linux-arm64 start -p functional-019259 --alsologtostderr -v=8: (7.321621234s)
functional_test.go:678: soft start took 7.322985753s for "functional-019259" cluster.
I1201 19:15:26.254972    4305 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
--- PASS: TestFunctional/serial/SoftStart (7.32s)
TestFunctional/serial/KubeContext (0.06s)
=== RUN   TestFunctional/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.06s)
TestFunctional/serial/KubectlGetPods (0.12s)
=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-019259 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.12s)
TestFunctional/serial/CacheCmd/cache/add_remote (3.5s)
=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 cache add registry.k8s.io/pause:3.1: (1.281424305s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 cache add registry.k8s.io/pause:3.3: (1.104066597s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 cache add registry.k8s.io/pause:latest: (1.110384549s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.50s)
TestFunctional/serial/CacheCmd/cache/add_local (1.4s)
=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-019259 /tmp/TestFunctionalserialCacheCmdcacheadd_local2194809079/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cache add minikube-local-cache-test:functional-019259
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cache delete minikube-local-cache-test:functional-019259
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-019259
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.40s)
TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)
=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)
TestFunctional/serial/CacheCmd/cache/list (0.06s)
=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.06s)
TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.31s)
=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.31s)
TestFunctional/serial/CacheCmd/cache/cache_reload (1.9s)
=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (310.83586ms)
-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.90s)
TestFunctional/serial/CacheCmd/cache/delete (0.12s)
=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.12s)
TestFunctional/serial/MinikubeKubectlCmd (0.15s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 kubectl -- --context functional-019259 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.15s)
TestFunctional/serial/MinikubeKubectlCmdDirectly (0.13s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-019259 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.13s)
TestFunctional/serial/ExtraConfig (62.76s)
=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-019259 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1201 19:15:49.033634    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Done: out/minikube-linux-arm64 start -p functional-019259 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (1m2.755565854s)
functional_test.go:776: restart took 1m2.755665328s for "functional-019259" cluster.
I1201 19:16:36.819853    4305 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
--- PASS: TestFunctional/serial/ExtraConfig (62.76s)
TestFunctional/serial/ComponentHealth (0.11s)
=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-019259 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:840: etcd phase: Running
functional_test.go:850: etcd status: Ready
functional_test.go:840: kube-apiserver phase: Running
functional_test.go:850: kube-apiserver status: Ready
functional_test.go:840: kube-controller-manager phase: Running
functional_test.go:850: kube-controller-manager status: Ready
functional_test.go:840: kube-scheduler phase: Running
functional_test.go:850: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.11s)
TestFunctional/serial/LogsCmd (1.49s)
=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 logs
functional_test.go:1251: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 logs: (1.490958265s)
--- PASS: TestFunctional/serial/LogsCmd (1.49s)
TestFunctional/serial/LogsFileCmd (1.51s)
=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 logs --file /tmp/TestFunctionalserialLogsFileCmd2829253215/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 logs --file /tmp/TestFunctionalserialLogsFileCmd2829253215/001/logs.txt: (1.504514207s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.51s)
TestFunctional/serial/InvalidService (4.95s)
=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-019259 apply -f testdata/invalidsvc.yaml
functional_test.go:2340: (dbg) Run:  out/minikube-linux-arm64 service invalid-svc -p functional-019259
functional_test.go:2340: (dbg) Non-zero exit: out/minikube-linux-arm64 service invalid-svc -p functional-019259: exit status 115 (458.719893ms)
-- stdout --
	┌───────────┬─────────────┬─────────────┬───────────────────────────┐
	│ NAMESPACE │    NAME     │ TARGET PORT │            URL            │
	├───────────┼─────────────┼─────────────┼───────────────────────────┤
	│ default   │ invalid-svc │ 80          │ http://192.168.49.2:31549 │
	└───────────┴─────────────┴─────────────┴───────────────────────────┘
	
	
-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
functional_test.go:2332: (dbg) Run:  kubectl --context functional-019259 delete -f testdata/invalidsvc.yaml
functional_test.go:2332: (dbg) Done: kubectl --context functional-019259 delete -f testdata/invalidsvc.yaml: (1.169236505s)
--- PASS: TestFunctional/serial/InvalidService (4.95s)

TestFunctional/parallel/ConfigCmd (0.68s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 config get cpus: exit status 14 (110.298593ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 config get cpus: exit status 14 (65.795352ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.68s)

TestFunctional/parallel/DashboardCmd (8.7s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-019259 --alsologtostderr -v=1]
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-019259 --alsologtostderr -v=1] ...
helpers_test.go:525: unable to kill pid 38971: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (8.70s)

TestFunctional/parallel/DryRun (0.46s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun
=== CONT  TestFunctional/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-019259 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-019259 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (203.868329ms)

-- stdout --
	* [functional-019259] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
-- /stdout --
** stderr ** 
	I1201 19:17:17.118205   38721 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:17:17.118351   38721 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:17:17.118359   38721 out.go:374] Setting ErrFile to fd 2...
	I1201 19:17:17.118372   38721 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:17:17.118731   38721 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:17:17.119182   38721 out.go:368] Setting JSON to false
	I1201 19:17:17.120386   38721 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":3589,"bootTime":1764613049,"procs":206,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:17:17.120464   38721 start.go:143] virtualization:  
	I1201 19:17:17.123711   38721 out.go:179] * [functional-019259] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:17:17.127659   38721 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:17:17.127776   38721 notify.go:221] Checking for updates...
	I1201 19:17:17.133616   38721 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:17:17.136662   38721 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:17:17.139518   38721 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:17:17.142372   38721 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:17:17.145296   38721 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:17:17.148792   38721 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 19:17:17.149389   38721 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:17:17.182896   38721 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:17:17.182998   38721 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:17:17.252303   38721 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-01 19:17:17.231119899 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:17:17.252411   38721 docker.go:319] overlay module found
	I1201 19:17:17.255374   38721 out.go:179] * Using the docker driver based on existing profile
	I1201 19:17:17.258160   38721 start.go:309] selected driver: docker
	I1201 19:17:17.258180   38721 start.go:927] validating driver "docker" against &{Name:functional-019259 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-019259 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:17:17.258293   38721 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:17:17.261708   38721 out.go:203] 
	W1201 19:17:17.264588   38721 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1201 19:17:17.267551   38721 out.go:203] 
** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-019259 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
--- PASS: TestFunctional/parallel/DryRun (0.46s)

TestFunctional/parallel/InternationalLanguage (0.22s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage
=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-019259 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-019259 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (221.494281ms)

-- stdout --
	* [functional-019259] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
-- /stdout --
** stderr ** 
	I1201 19:17:16.922092   38643 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:17:16.922330   38643 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:17:16.922359   38643 out.go:374] Setting ErrFile to fd 2...
	I1201 19:17:16.922378   38643 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:17:16.923493   38643 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:17:16.923959   38643 out.go:368] Setting JSON to false
	I1201 19:17:16.924989   38643 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":3588,"bootTime":1764613049,"procs":206,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:17:16.925089   38643 start.go:143] virtualization:  
	I1201 19:17:16.930487   38643 out.go:179] * [functional-019259] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1201 19:17:16.933611   38643 notify.go:221] Checking for updates...
	I1201 19:17:16.933617   38643 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:17:16.936655   38643 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:17:16.940143   38643 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:17:16.943012   38643 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:17:16.945954   38643 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:17:16.948875   38643 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:17:16.952245   38643 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 19:17:16.952833   38643 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:17:16.978731   38643 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:17:16.978849   38643 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:17:17.047748   38643 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-01 19:17:17.035482527 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:17:17.047865   38643 docker.go:319] overlay module found
	I1201 19:17:17.051001   38643 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1201 19:17:17.054008   38643 start.go:309] selected driver: docker
	I1201 19:17:17.054035   38643 start.go:927] validating driver "docker" against &{Name:functional-019259 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-019259 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:17:17.054150   38643 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:17:17.057728   38643 out.go:203] 
	W1201 19:17:17.060713   38643 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1201 19:17:17.063684   38643 out.go:203] 
** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.22s)

TestFunctional/parallel/StatusCmd (1.3s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd
=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 status
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (1.30s)

TestFunctional/parallel/ServiceCmdConnect (7.78s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-019259 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1640: (dbg) Run:  kubectl --context functional-019259 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:352: "hello-node-connect-7d85dfc575-hbwcs" [4c33820d-b887-45d7-a99f-4d622675da32] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:352: "hello-node-connect-7d85dfc575-hbwcs" [4c33820d-b887-45d7-a99f-4d622675da32] Running
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 7.003332588s
functional_test.go:1654: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 service hello-node-connect --url
functional_test.go:1660: found endpoint for hello-node-connect: http://192.168.49.2:31617
functional_test.go:1680: http://192.168.49.2:31617: success! body:
Request served by hello-node-connect-7d85dfc575-hbwcs

HTTP/1.1 GET /

Host: 192.168.49.2:31617
Accept-Encoding: gzip
User-Agent: Go-http-client/1.1
--- PASS: TestFunctional/parallel/ServiceCmdConnect (7.78s)

TestFunctional/parallel/AddonsCmd (0.15s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd
=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.15s)

TestFunctional/parallel/PersistentVolumeClaim (24.85s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:352: "storage-provisioner" [909ec402-94c6-47ce-9662-a33d194aa5b6] Running
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.0039808s
functional_test_pvc_test.go:55: (dbg) Run:  kubectl --context functional-019259 get storageclass -o=json
functional_test_pvc_test.go:75: (dbg) Run:  kubectl --context functional-019259 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:82: (dbg) Run:  kubectl --context functional-019259 get pvc myclaim -o=json
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-019259 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:352: "sp-pod" [7bd360cf-810f-4bc7-9740-96a900a8b4ae] Pending
helpers_test.go:352: "sp-pod" [7bd360cf-810f-4bc7-9740-96a900a8b4ae] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:352: "sp-pod" [7bd360cf-810f-4bc7-9740-96a900a8b4ae] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 9.006940044s
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-019259 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:112: (dbg) Run:  kubectl --context functional-019259 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:112: (dbg) Done: kubectl --context functional-019259 delete -f testdata/storage-provisioner/pod.yaml: (1.651684627s)
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-019259 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:352: "sp-pod" [d079a7db-2045-44a9-be95-f25687ce059b] Pending
helpers_test.go:352: "sp-pod" [d079a7db-2045-44a9-be95-f25687ce059b] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:352: "sp-pod" [d079a7db-2045-44a9-be95-f25687ce059b] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.003265047s
functional_test_pvc_test.go:120: (dbg) Run:  kubectl --context functional-019259 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (24.85s)

TestFunctional/parallel/SSHCmd (0.72s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.72s)

TestFunctional/parallel/CpCmd (2.65s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh -n functional-019259 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cp functional-019259:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd3385059404/001/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh -n functional-019259 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh -n functional-019259 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (2.65s)

TestFunctional/parallel/FileSync (0.37s)
=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/4305/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo cat /etc/test/nested/copy/4305/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.37s)

TestFunctional/parallel/CertSync (2.31s)
=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/4305.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo cat /etc/ssl/certs/4305.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/4305.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo cat /usr/share/ca-certificates/4305.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/43052.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo cat /etc/ssl/certs/43052.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/43052.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo cat /usr/share/ca-certificates/43052.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (2.31s)

TestFunctional/parallel/NodeLabels (0.09s)
=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-019259 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.09s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.59s)
=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 ssh "sudo systemctl is-active docker": exit status 1 (300.208192ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 ssh "sudo systemctl is-active crio": exit status 1 (288.922564ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.59s)

TestFunctional/parallel/License (0.32s)
=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License
=== CONT  TestFunctional/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctional/parallel/License (0.32s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.69s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-019259 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-019259 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-019259 tunnel --alsologtostderr] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-019259 tunnel --alsologtostderr] ...
helpers_test.go:525: unable to kill pid 36087: os: process already finished
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.69s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-019259 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (9.44s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-019259 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:352: "nginx-svc" [8de3f593-8d6a-48a1-93b9-7ff9ea3d1710] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "nginx-svc" [8de3f593-8d6a-48a1-93b9-7ff9ea3d1710] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 9.004185692s
I1201 19:16:56.412213    4305 kapi.go:150] Service nginx-svc in namespace default found.
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (9.44s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.08s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-019259 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.08s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.110.12.91 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-019259 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: signal: terminated
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

TestFunctional/parallel/ServiceCmd/DeployApp (8.28s)
=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-019259 create deployment hello-node --image kicbase/echo-server
functional_test.go:1455: (dbg) Run:  kubectl --context functional-019259 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:352: "hello-node-75c85bcc94-89npf" [2018bfce-b9fb-42ee-bb35-ef303580153b] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:352: "hello-node-75c85bcc94-89npf" [2018bfce-b9fb-42ee-bb35-ef303580153b] Running
E1201 19:17:10.955201    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 8.003501831s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (8.28s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.46s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.46s)

TestFunctional/parallel/ServiceCmd/List (0.63s)
=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.63s)

TestFunctional/parallel/ProfileCmd/profile_list (0.57s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "495.787097ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "72.238566ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.57s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.55s)
=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "482.574398ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "65.870975ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.55s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.63s)
=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 service list -o json
functional_test.go:1504: Took "634.665631ms" to run "out/minikube-linux-arm64 -p functional-019259 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.63s)

TestFunctional/parallel/MountCmd/any-port (8.86s)
=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdany-port3622638413/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1764616633954449607" to /tmp/TestFunctionalparallelMountCmdany-port3622638413/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1764616633954449607" to /tmp/TestFunctionalparallelMountCmdany-port3622638413/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1764616633954449607" to /tmp/TestFunctionalparallelMountCmdany-port3622638413/001/test-1764616633954449607
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (495.237571ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1201 19:17:14.451599    4305 retry.go:31] will retry after 563.483974ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec  1 19:17 created-by-test
-rw-r--r-- 1 docker docker 24 Dec  1 19:17 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec  1 19:17 test-1764616633954449607
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh cat /mount-9p/test-1764616633954449607
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-019259 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:352: "busybox-mount" [33421ef5-52eb-4ff1-9aa4-75043bd96d5b] Pending
helpers_test.go:352: "busybox-mount" [33421ef5-52eb-4ff1-9aa4-75043bd96d5b] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:352: "busybox-mount" [33421ef5-52eb-4ff1-9aa4-75043bd96d5b] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:352: "busybox-mount" [33421ef5-52eb-4ff1-9aa4-75043bd96d5b] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 5.003476033s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-019259 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdany-port3622638413/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (8.86s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.55s)
=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 service --namespace=default --https --url hello-node
functional_test.go:1532: found endpoint: https://192.168.49.2:31049
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.55s)

TestFunctional/parallel/ServiceCmd/Format (0.38s)
=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.38s)

TestFunctional/parallel/ServiceCmd/URL (0.57s)
=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 service hello-node --url
functional_test.go:1575: found endpoint for hello-node: http://192.168.49.2:31049
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.57s)

TestFunctional/parallel/MountCmd/specific-port (2.13s)
=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdspecific-port2051026071/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (547.572114ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1201 19:17:23.358355    4305 retry.go:31] will retry after 431.147535ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdspecific-port2051026071/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 ssh "sudo umount -f /mount-9p": exit status 1 (286.076248ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-019259 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdspecific-port2051026071/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (2.13s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.37s)
=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3133801850/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3133801850/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3133801850/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh "findmnt -T" /mount3
2025/12/01 19:17:26 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-019259 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3133801850/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3133801850/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-019259 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3133801850/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.37s)

TestFunctional/parallel/Version/short (0.07s)
=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 version --short
--- PASS: TestFunctional/parallel/Version/short (0.07s)

TestFunctional/parallel/Version/components (1.37s)
=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 version -o=json --components
functional_test.go:2275: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 version -o=json --components: (1.371733379s)
--- PASS: TestFunctional/parallel/Version/components (1.37s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.31s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-019259 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.34.2
registry.k8s.io/kube-proxy:v1.34.2
registry.k8s.io/kube-controller-manager:v1.34.2
registry.k8s.io/kube-apiserver:v1.34.2
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.12.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/minikube-local-cache-test:functional-019259
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:functional-019259
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-019259 image ls --format short --alsologtostderr:
I1201 19:17:33.244528   41711 out.go:360] Setting OutFile to fd 1 ...
I1201 19:17:33.244728   41711 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.244742   41711 out.go:374] Setting ErrFile to fd 2...
I1201 19:17:33.244748   41711 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.245054   41711 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:17:33.245806   41711 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.245972   41711 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.246547   41711 cli_runner.go:164] Run: docker container inspect functional-019259 --format={{.State.Status}}
I1201 19:17:33.279658   41711 ssh_runner.go:195] Run: systemctl --version
I1201 19:17:33.279720   41711 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-019259
I1201 19:17:33.305110   41711 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-019259/id_rsa Username:docker}
I1201 19:17:33.413138   41711 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.31s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.27s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-019259 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬────────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG         │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼────────────────────┼───────────────┼────────┤
│ registry.k8s.io/kube-proxy                  │ v1.34.2            │ sha256:94bff1 │ 22.8MB │
│ docker.io/library/nginx                     │ latest             │ sha256:bb747c │ 58.3MB │
│ registry.k8s.io/kube-apiserver              │ v1.34.2            │ sha256:b178af │ 24.6MB │
│ registry.k8s.io/kube-controller-manager     │ v1.34.2            │ sha256:1b3491 │ 20.7MB │
│ registry.k8s.io/pause                       │ latest             │ sha256:8cb209 │ 71.3kB │
│ docker.io/library/minikube-local-cache-test │ functional-019259  │ sha256:4f3a5d │ 992B   │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                 │ sha256:ba04bb │ 8.03MB │
│ registry.k8s.io/pause                       │ 3.10.1             │ sha256:d7b100 │ 268kB  │
│ docker.io/kindest/kindnetd                  │ v20250512-df8de77b │ sha256:b1a8c6 │ 40.6MB │
│ docker.io/library/nginx                     │ alpine             │ sha256:cbad63 │ 23.1MB │
│ registry.k8s.io/coredns/coredns             │ v1.12.1            │ sha256:138784 │ 20.4MB │
│ registry.k8s.io/pause                       │ 3.1                │ sha256:8057e0 │ 262kB  │
│ docker.io/kicbase/echo-server               │ functional-019259  │ sha256:ce2d2c │ 2.17MB │
│ gcr.io/k8s-minikube/busybox                 │ 1.28.4-glibc       │ sha256:1611cd │ 1.94MB │
│ registry.k8s.io/etcd                        │ 3.6.5-0            │ sha256:2c5f0d │ 21.1MB │
│ registry.k8s.io/kube-scheduler              │ v1.34.2            │ sha256:4f982e │ 15.8MB │
│ registry.k8s.io/pause                       │ 3.3                │ sha256:3d1873 │ 249kB  │
└─────────────────────────────────────────────┴────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-019259 image ls --format table --alsologtostderr:
I1201 19:17:33.544256   41791 out.go:360] Setting OutFile to fd 1 ...
I1201 19:17:33.544393   41791 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.544404   41791 out.go:374] Setting ErrFile to fd 2...
I1201 19:17:33.544410   41791 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.544712   41791 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:17:33.545380   41791 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.545565   41791 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.546130   41791 cli_runner.go:164] Run: docker container inspect functional-019259 --format={{.State.Status}}
I1201 19:17:33.569774   41791 ssh_runner.go:195] Run: systemctl --version
I1201 19:17:33.569827   41791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-019259
I1201 19:17:33.593956   41791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-019259/id_rsa Username:docker}
I1201 19:17:33.701479   41791 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.27s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.33s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-019259 image ls --format json --alsologtostderr:
[{"id":"sha256:94bff1bec29fd04573941f362e44a6730b151d46df215613feb3f1167703f786","repoDigests":["registry.k8s.io/kube-proxy@sha256:d8b843ac8a5e861238df24a4db8c2ddced89948633400c4660464472045276f5"],"repoTags":["registry.k8s.io/kube-proxy:v1.34.2"],"size":"22802260"},{"id":"sha256:4f3a5d641d9b7a5007231441eda3adf17b6874d8b72429dc7a44618c67a293d6","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-019259"],"size":"992"},{"id":"sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8034419"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sh
a256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"267939"},{"id":"sha256:20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8","repoDigests":["docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93"],"repoTags":[],"size":"74084559"},{"id":"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc","repoDigests":["registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"],"repoTags":["registry.k8s.io/coredns/coredns:v1.12.1"],"size":"20392204"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":["registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21136588"},{"id":"sha256:4f982e73e768a6ccebb54f8905b83b78d56b3a014e709c0bfe77140db3543949","repoDigests":["registry.k8s.io/kube-schedule
r@sha256:44229946c0966b07d5c0791681d803e77258949985e49b4ab0fbdff99d2a48c6"],"repoTags":["registry.k8s.io/kube-scheduler:v1.34.2"],"size":"15775785"},{"id":"sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"40636774"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-019259"],"size":"2173567"},{"id":"sha256:bb747ca923a5e1139baddd6f4743e0c0c74df58f4ad8ddbc10ab183b92f5a5c7","repoDigests":["docker.io/library/nginx@sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42"],"repoTags":["docker.io/library/nginx:latest"],"size":"58263548"},{"id":"sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ec
e7998cd84121f952adef120ea7c5a6f00e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"1935750"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a","repoDigests":["docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c"],"repoTags":[],"size":"18306114"},{"id":"sha256:cbad6347cca28a6ee7b08793856bc6fcb2c2c7a377a62a5e6d785895c4194ac1","repoDigests":["docker.io/library/nginx@sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14"],"repoTags":["docker.io/library/nginx:alpine"],"size":"23117513"},{"id":"sha256:b178af3d91f80925cd8bec42e1813e7d46370236a811d3380c9c10a02b245ca7","repoDigests":["regi
stry.k8s.io/kube-apiserver@sha256:e009ef63deaf797763b5bd423d04a099a2fe414a081bf7d216b43bc9e76b9077"],"repoTags":["registry.k8s.io/kube-apiserver:v1.34.2"],"size":"24559643"},{"id":"sha256:1b34917560f0916ad0d1e98debeaf98c640b68c5a38f6d87711f0e288e5d7be2","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:5c3998664b77441c09a4604f1361b230e63f7a6f299fc02fc1ebd1a12c38e3eb"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.34.2"],"size":"20718696"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-019259 image ls --format json --alsologtostderr:
I1201 19:17:33.520660   41784 out.go:360] Setting OutFile to fd 1 ...
I1201 19:17:33.520912   41784 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.520925   41784 out.go:374] Setting ErrFile to fd 2...
I1201 19:17:33.520964   41784 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.521364   41784 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:17:33.522536   41784 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.522725   41784 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.523295   41784 cli_runner.go:164] Run: docker container inspect functional-019259 --format={{.State.Status}}
I1201 19:17:33.562331   41784 ssh_runner.go:195] Run: systemctl --version
I1201 19:17:33.562468   41784 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-019259
I1201 19:17:33.587385   41784 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-019259/id_rsa Username:docker}
I1201 19:17:33.701711   41784 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.33s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.27s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-019259 image ls --format yaml --alsologtostderr:
- id: sha256:a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a
repoDigests:
- docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c
repoTags: []
size: "18306114"
- id: sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "1935750"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests:
- registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21136588"
- id: sha256:b178af3d91f80925cd8bec42e1813e7d46370236a811d3380c9c10a02b245ca7
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:e009ef63deaf797763b5bd423d04a099a2fe414a081bf7d216b43bc9e76b9077
repoTags:
- registry.k8s.io/kube-apiserver:v1.34.2
size: "24559643"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
repoTags:
- registry.k8s.io/pause:3.10.1
size: "267939"
- id: sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "40636774"
- id: sha256:4f3a5d641d9b7a5007231441eda3adf17b6874d8b72429dc7a44618c67a293d6
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-019259
size: "992"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-019259
size: "2173567"
- id: sha256:1b34917560f0916ad0d1e98debeaf98c640b68c5a38f6d87711f0e288e5d7be2
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:5c3998664b77441c09a4604f1361b230e63f7a6f299fc02fc1ebd1a12c38e3eb
repoTags:
- registry.k8s.io/kube-controller-manager:v1.34.2
size: "20718696"
- id: sha256:4f982e73e768a6ccebb54f8905b83b78d56b3a014e709c0bfe77140db3543949
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:44229946c0966b07d5c0791681d803e77258949985e49b4ab0fbdff99d2a48c6
repoTags:
- registry.k8s.io/kube-scheduler:v1.34.2
size: "15775785"
- id: sha256:cbad6347cca28a6ee7b08793856bc6fcb2c2c7a377a62a5e6d785895c4194ac1
repoDigests:
- docker.io/library/nginx@sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14
repoTags:
- docker.io/library/nginx:alpine
size: "23117513"
- id: sha256:bb747ca923a5e1139baddd6f4743e0c0c74df58f4ad8ddbc10ab183b92f5a5c7
repoDigests:
- docker.io/library/nginx@sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42
repoTags:
- docker.io/library/nginx:latest
size: "58263548"
- id: sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c
repoTags:
- registry.k8s.io/coredns/coredns:v1.12.1
size: "20392204"
- id: sha256:20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8
repoDigests:
- docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93
repoTags: []
size: "74084559"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8034419"
- id: sha256:94bff1bec29fd04573941f362e44a6730b151d46df215613feb3f1167703f786
repoDigests:
- registry.k8s.io/kube-proxy@sha256:d8b843ac8a5e861238df24a4db8c2ddced89948633400c4660464472045276f5
repoTags:
- registry.k8s.io/kube-proxy:v1.34.2
size: "22802260"

                                                
                                                
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-019259 image ls --format yaml --alsologtostderr:
I1201 19:17:33.227815   41710 out.go:360] Setting OutFile to fd 1 ...
I1201 19:17:33.228048   41710 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.228076   41710 out.go:374] Setting ErrFile to fd 2...
I1201 19:17:33.228093   41710 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:33.228402   41710 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:17:33.229177   41710 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.229376   41710 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:33.230095   41710 cli_runner.go:164] Run: docker container inspect functional-019259 --format={{.State.Status}}
I1201 19:17:33.256385   41710 ssh_runner.go:195] Run: systemctl --version
I1201 19:17:33.256445   41710 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-019259
I1201 19:17:33.284356   41710 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-019259/id_rsa Username:docker}
I1201 19:17:33.400193   41710 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.27s)

TestFunctional/parallel/ImageCommands/ImageBuild (4.08s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-019259 ssh pgrep buildkitd: exit status 1 (284.580997ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr: (3.555870949s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-019259 image build -t localhost/my-image:functional-019259 testdata/build --alsologtostderr:
I1201 19:17:34.093232   41925 out.go:360] Setting OutFile to fd 1 ...
I1201 19:17:34.093395   41925 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:34.093409   41925 out.go:374] Setting ErrFile to fd 2...
I1201 19:17:34.093416   41925 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:17:34.093768   41925 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:17:34.094463   41925 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:34.097420   41925 config.go:182] Loaded profile config "functional-019259": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1201 19:17:34.098101   41925 cli_runner.go:164] Run: docker container inspect functional-019259 --format={{.State.Status}}
I1201 19:17:34.116029   41925 ssh_runner.go:195] Run: systemctl --version
I1201 19:17:34.116086   41925 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-019259
I1201 19:17:34.135036   41925 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-019259/id_rsa Username:docker}
I1201 19:17:34.248108   41925 build_images.go:162] Building image from path: /tmp/build.2199282208.tar
I1201 19:17:34.248234   41925 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1201 19:17:34.256566   41925 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.2199282208.tar
I1201 19:17:34.260259   41925 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.2199282208.tar: stat -c "%s %y" /var/lib/minikube/build/build.2199282208.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.2199282208.tar': No such file or directory
I1201 19:17:34.260289   41925 ssh_runner.go:362] scp /tmp/build.2199282208.tar --> /var/lib/minikube/build/build.2199282208.tar (3072 bytes)
I1201 19:17:34.280256   41925 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.2199282208
I1201 19:17:34.288474   41925 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.2199282208 -xf /var/lib/minikube/build/build.2199282208.tar
I1201 19:17:34.297010   41925 containerd.go:394] Building image: /var/lib/minikube/build/build.2199282208
I1201 19:17:34.297113   41925 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2199282208 --local dockerfile=/var/lib/minikube/build/build.2199282208 --output type=image,name=localhost/my-image:functional-019259
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.5s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

#6 [2/3] RUN true
#6 DONE 0.6s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:fd1e297cc137bb85eb7ae45a94c485c4787c4899f6afad39aecf6cbe80a962d3
#8 exporting manifest sha256:fd1e297cc137bb85eb7ae45a94c485c4787c4899f6afad39aecf6cbe80a962d3 0.0s done
#8 exporting config sha256:2e36a06f870f5d29601ff8b8a744930bbfc6b697ea8989a04c9fc632579b04d0 0.0s done
#8 naming to localhost/my-image:functional-019259 done
#8 DONE 0.2s
I1201 19:17:37.561399   41925 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2199282208 --local dockerfile=/var/lib/minikube/build/build.2199282208 --output type=image,name=localhost/my-image:functional-019259: (3.264254531s)
I1201 19:17:37.561514   41925 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.2199282208
I1201 19:17:37.571711   41925 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.2199282208.tar
I1201 19:17:37.580717   41925 build_images.go:218] Built localhost/my-image:functional-019259 from /tmp/build.2199282208.tar
I1201 19:17:37.580800   41925 build_images.go:134] succeeded building to: functional-019259
I1201 19:17:37.580808   41925 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (4.08s)

TestFunctional/parallel/ImageCommands/Setup (0.65s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-019259
--- PASS: TestFunctional/parallel/ImageCommands/Setup (0.65s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.33s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr
functional_test.go:370: (dbg) Done: out/minikube-linux-arm64 -p functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr: (1.04850942s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.33s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.25s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.25s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.21s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.21s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.24s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.24s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.41s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-019259
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image load --daemon kicbase/echo-server:functional-019259 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.41s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.48s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image save kicbase/echo-server:functional-019259 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.48s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.5s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image rm kicbase/echo-server:functional-019259 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.50s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.6s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.60s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.4s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-019259
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-019259 image save --daemon kicbase/echo-server:functional-019259 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-019259
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.40s)

TestFunctional/delete_echo-server_images (0.04s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-019259
--- PASS: TestFunctional/delete_echo-server_images (0.04s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-019259
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-019259
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/21997-2497/.minikube/files/etc/test/nested/copy/4305/hosts
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext (0.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote (3.17s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 cache add registry.k8s.io/pause:3.1: (1.064758793s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 cache add registry.k8s.io/pause:3.3: (1.060473775s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 cache add registry.k8s.io/pause:latest: (1.042982182s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote (3.17s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local (1.08s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialCach628129678/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cache add minikube-local-cache-test:functional-428744
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cache delete minikube-local-cache-test:functional-428744
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-428744
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local (1.08s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete (0.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete (0.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node (0.29s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh sudo crictl images
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node (0.29s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload (1.86s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (280.567014ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload (1.86s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete (0.11s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete (0.11s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd (0.94s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd (0.94s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd (1.04s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialLogs2028418883/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialLogs2028418883/001/logs.txt: (1.036930892s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd (1.04s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd (0.43s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 config get cpus: exit status 14 (62.895034ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 config get cpus: exit status 14 (63.866999ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd (0.43s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun (0.44s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 23 (189.841135ms)

-- stdout --
	* [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1201 19:46:57.158455   71813 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:46:57.158629   71813 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.158660   71813 out.go:374] Setting ErrFile to fd 2...
	I1201 19:46:57.158681   71813 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:57.158963   71813 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:46:57.159364   71813 out.go:368] Setting JSON to false
	I1201 19:46:57.160170   71813 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":5369,"bootTime":1764613049,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:46:57.160266   71813 start.go:143] virtualization:  
	I1201 19:46:57.163715   71813 out.go:179] * [functional-428744] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1201 19:46:57.167256   71813 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:46:57.167315   71813 notify.go:221] Checking for updates...
	I1201 19:46:57.173019   71813 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:46:57.175926   71813 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:46:57.178898   71813 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:46:57.181801   71813 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:46:57.184620   71813 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:46:57.188021   71813 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:46:57.188614   71813 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:46:57.220373   71813 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:46:57.220518   71813 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.282709   71813 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.272632677 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.282816   71813 docker.go:319] overlay module found
	I1201 19:46:57.285901   71813 out.go:179] * Using the docker driver based on existing profile
	I1201 19:46:57.288726   71813 start.go:309] selected driver: docker
	I1201 19:46:57.288747   71813 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.288853   71813 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:46:57.292538   71813 out.go:203] 
	W1201 19:46:57.295482   71813 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1201 19:46:57.298743   71813 out.go:203] 

** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-428744 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun (0.44s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage (0.2s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-428744 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 23 (197.853016ms)

-- stdout --
	* [functional-428744] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1201 19:46:56.969402   71766 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:46:56.969588   71766 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:56.969624   71766 out.go:374] Setting ErrFile to fd 2...
	I1201 19:46:56.969647   71766 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:46:56.970056   71766 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:46:56.970537   71766 out.go:368] Setting JSON to false
	I1201 19:46:56.971384   71766 start.go:133] hostinfo: {"hostname":"ip-172-31-24-2","uptime":5368,"bootTime":1764613049,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"6d436adf-771e-4269-b9a3-c25fd4fca4f5"}
	I1201 19:46:56.971455   71766 start.go:143] virtualization:  
	I1201 19:46:56.974997   71766 out.go:179] * [functional-428744] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1201 19:46:56.978890   71766 out.go:179]   - MINIKUBE_LOCATION=21997
	I1201 19:46:56.978976   71766 notify.go:221] Checking for updates...
	I1201 19:46:56.985004   71766 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1201 19:46:56.987975   71766 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	I1201 19:46:56.990863   71766 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	I1201 19:46:56.994320   71766 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1201 19:46:56.997264   71766 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1201 19:46:57.000638   71766 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1201 19:46:57.001233   71766 driver.go:422] Setting default libvirt URI to qemu:///system
	I1201 19:46:57.035688   71766 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1201 19:46:57.035807   71766 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:46:57.092359   71766 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-01 19:46:57.0831092 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:46:57.092485   71766 docker.go:319] overlay module found
	I1201 19:46:57.095659   71766 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1201 19:46:57.098597   71766 start.go:309] selected driver: docker
	I1201 19:46:57.098622   71766 start.go:927] validating driver "docker" against &{Name:functional-428744 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-428744 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1201 19:46:57.098730   71766 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1201 19:46:57.102292   71766 out.go:203] 
	W1201 19:46:57.105218   71766 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1201 19:46:57.108083   71766 out.go:203] 

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage (0.20s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd (0.14s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 addons list -o json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd (0.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd (0.72s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "cat /etc/hostname"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd (0.72s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd (2.38s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh -n functional-428744 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cp functional-428744:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelCp929166965/001/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh -n functional-428744 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh -n functional-428744 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd (2.38s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync (0.29s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/4305/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo cat /etc/test/nested/copy/4305/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync (0.29s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync (1.74s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/4305.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo cat /etc/ssl/certs/4305.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/4305.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo cat /usr/share/ca-certificates/4305.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/43052.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo cat /etc/ssl/certs/43052.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/43052.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo cat /usr/share/ca-certificates/43052.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
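The `51391683.0` and `3ec20f2e.0` names checked above follow OpenSSL's hashed-name convention for cert stores: `<subject-hash>.<n>`, where `n` resolves hash collisions (the hash itself normally comes from `openssl x509 -subject_hash -noout -in cert.pem`). A minimal sketch of the collision-counter naming, as an illustration rather than minikube's actual code:

```shell
# Illustrative sketch: pick the next free "<hash>.<n>" name in a cert dir.
# The hash value is taken from the log above; the helper name is made up.
next_store_name() {
  hash=$1
  dir=$2
  n=0
  # increment the suffix until the name is unused in the target directory
  while [ -e "$dir/$hash.$n" ]; do
    n=$((n + 1))
  done
  echo "$hash.$n"
}
```

For example, with an empty directory this yields `51391683.0`, matching the first file the test looks for.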
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync (1.74s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled (0.59s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh "sudo systemctl is-active docker": exit status 1 (319.815014ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh "sudo systemctl is-active crio": exit status 1 (274.275496ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
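The non-zero exits above are the expected outcome: `systemctl is-active` prints the unit state and exits 0 only when the unit is active (an inactive unit exits 3, which ssh surfaces as "status 3"). A sketch of the check this test effectively performs; the helper name is illustrative, not minikube's code:

```shell
# Sketch: a runtime counts as disabled when `systemctl is-active` both
# exits non-zero and reports "inactive". Helper name is made up.
runtime_disabled() {
  state=$(systemctl is-active "$1" 2>/dev/null)
  rc=$?
  # non-zero exit plus "inactive" state => unit is off
  [ "$rc" -ne 0 ] && [ "$state" = "inactive" ]
}
```

On this containerd cluster both the `docker` and `crio` units report inactive, so the test passes despite the exit status 1 from the ssh wrapper.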
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled (0.59s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License (0.3s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License (0.30s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr]
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel (0.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-428744 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: exit status 103
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel (0.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create (0.42s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
E1201 19:46:46.971658    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create (0.42s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list (0.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "339.65092ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "48.089332ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list (0.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output (0.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "338.955667ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "52.669541ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output (0.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port (2.11s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo894755151/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (348.412112ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1201 19:46:50.684822    4305 retry.go:31] will retry after 748.519444ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo894755151/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh "sudo umount -f /mount-9p": exit status 1 (271.756697ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-428744 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo894755151/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
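The forced `umount -f` failing with "not mounted" (status 32) is harmless here: the 9p mount was already torn down when the mount daemon stopped, and the test only logs the non-zero exit. A hedged sketch of idempotent unmount cleanup in that spirit; the helper name and path are illustrative, not minikube's code:

```shell
# Sketch: cleanup that tolerates an already-unmounted target, as this test
# does. umount(8) reports "not mounted" with a non-zero status (32 in the
# log above); treating that case as success makes cleanup safe to re-run.
cleanup_mount() {
  if out=$(umount -f "$1" 2>&1); then
    echo "unmounted $1"
  elif printf '%s\n' "$out" | grep -q 'not mounted'; then
    # the target was never (or no longer) mounted: nothing to do
    echo "$1 already unmounted"
  else
    printf 'umount failed: %s\n' "$out" >&2
    return 1
  fi
}
```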
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port (2.11s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup (1.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-428744 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-428744 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo4199567437/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup (1.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short (0.08s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 version --short
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short (0.08s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components (0.51s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 version -o=json --components
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components (0.51s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort (0.24s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-428744 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.35.0-beta.0
registry.k8s.io/kube-proxy:v1.35.0-beta.0
registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
registry.k8s.io/kube-apiserver:v1.35.0-beta.0
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.13.1
gcr.io/k8s-minikube/storage-provisioner:v5
docker.io/library/minikube-local-cache-test:functional-428744
docker.io/kicbase/echo-server:functional-428744
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-428744 image ls --format short --alsologtostderr:
I1201 19:47:10.185915   73994 out.go:360] Setting OutFile to fd 1 ...
I1201 19:47:10.186085   73994 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:10.186116   73994 out.go:374] Setting ErrFile to fd 2...
I1201 19:47:10.186138   73994 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:10.186534   73994 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:47:10.187510   73994 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:10.187732   73994 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:10.188919   73994 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:47:10.206552   73994 ssh_runner.go:195] Run: systemctl --version
I1201 19:47:10.206636   73994 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:47:10.224461   73994 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
I1201 19:47:10.328774   73994 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort (0.24s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable (0.23s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-428744 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬───────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG        │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼───────────────────┼───────────────┼────────┤
│ registry.k8s.io/kube-proxy                  │ v1.35.0-beta.0    │ sha256:404c2e │ 22.4MB │
│ docker.io/library/minikube-local-cache-test │ functional-428744 │ sha256:4f3a5d │ 992B   │
│ registry.k8s.io/kube-scheduler              │ v1.35.0-beta.0    │ sha256:163787 │ 15.4MB │
│ registry.k8s.io/pause                       │ 3.1               │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/pause                       │ 3.10.1            │ sha256:d7b100 │ 265kB  │
│ registry.k8s.io/pause                       │ 3.3               │ sha256:3d1873 │ 249kB  │
│ registry.k8s.io/pause                       │ latest            │ sha256:8cb209 │ 71.3kB │
│ docker.io/kicbase/echo-server               │ functional-428744 │ sha256:ce2d2c │ 2.17MB │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                │ sha256:667491 │ 8.03MB │
│ localhost/my-image                          │ functional-428744 │ sha256:408043 │ 831kB  │
│ registry.k8s.io/coredns/coredns             │ v1.13.1           │ sha256:e08f4d │ 21.2MB │
│ registry.k8s.io/etcd                        │ 3.6.5-0           │ sha256:2c5f0d │ 21.1MB │
│ registry.k8s.io/kube-apiserver              │ v1.35.0-beta.0    │ sha256:ccd634 │ 24.7MB │
│ registry.k8s.io/kube-controller-manager     │ v1.35.0-beta.0    │ sha256:68b5f7 │ 20.7MB │
└─────────────────────────────────────────────┴───────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-428744 image ls --format table --alsologtostderr:
I1201 19:47:14.526893   74437 out.go:360] Setting OutFile to fd 1 ...
I1201 19:47:14.527118   74437 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:14.527132   74437 out.go:374] Setting ErrFile to fd 2...
I1201 19:47:14.527138   74437 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:14.527430   74437 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:47:14.528178   74437 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:14.528345   74437 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:14.528910   74437 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:47:14.546377   74437 ssh_runner.go:195] Run: systemctl --version
I1201 19:47:14.546435   74437 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:47:14.563788   74437 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
I1201 19:47:14.673100   74437 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable (0.23s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson (0.23s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-428744 image ls --format json --alsologtostderr:
[{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-428744"],"size":"2173567"},{"id":"sha256:4f3a5d641d9b7a5007231441eda3adf17b6874d8b72429dc7a44618c67a293d6","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-428744"],"size":"992"},{"id":"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.13.1"],"size":"21166088"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21134420"},{"id":
"sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.35.0-beta.0"],"size":"24676285"},{"id":"sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.35.0-beta.0"],"size":"22428165"},{"id":"sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.35.0-beta.0"],"size":"15389290"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"265458"},{"id":"sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8032639"},{"id":"sha256:4080434db4544c9ad723959f1a84a91883dfea2013af4850c1bea2166ef7f4e4","repoDigests":[],"repoTags":["localhost/my-image:functional-428744"],"size":"830617"},{"id":
"sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"],"size":"20658969"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-428744 image ls --format json --alsologtostderr:
I1201 19:47:14.300178   74399 out.go:360] Setting OutFile to fd 1 ...
I1201 19:47:14.300327   74399 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:14.300339   74399 out.go:374] Setting ErrFile to fd 2...
I1201 19:47:14.300359   74399 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:14.300749   74399 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:47:14.301768   74399 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:14.301972   74399 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:14.302765   74399 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:47:14.321108   74399 ssh_runner.go:195] Run: systemctl --version
I1201 19:47:14.321169   74399 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:47:14.339756   74399 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
I1201 19:47:14.444030   74399 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson (0.23s)
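The `image ls --format json` output above is a flat JSON array of image records whose `size` field is serialized as a string. As an editor's aside (not part of the test suite), a minimal Python sketch for sanity-checking such output, using a trimmed sample of two entries captured above with their ids shortened, is:

```python
import json

# Trimmed sample of the `image ls --format json` entries captured above
# (image ids shortened for readability).
raw = """[
  {"id": "sha256:d7b100cd9a77", "repoDigests": [], "repoTags": ["registry.k8s.io/pause:3.10.1"], "size": "265458"},
  {"id": "sha256:2c5f0dedd21c", "repoDigests": [], "repoTags": ["registry.k8s.io/etcd:3.6.5-0"], "size": "21134420"}
]"""

images = json.loads(raw)
# `size` is a string in this format, so convert before doing arithmetic.
total_bytes = sum(int(img["size"]) for img in images)
tags = sorted(tag for img in images for tag in img["repoTags"])
print(len(images), total_bytes, tags)
```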

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml (0.24s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-428744 image ls --format yaml --alsologtostderr:
- id: sha256:4080434db4544c9ad723959f1a84a91883dfea2013af4850c1bea2166ef7f4e4
repoDigests: []
repoTags:
- localhost/my-image:functional-428744
size: "830617"
- id: sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.35.0-beta.0
size: "24676285"
- id: sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
size: "20658969"
- id: sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.35.0-beta.0
size: "15389290"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:4f3a5d641d9b7a5007231441eda3adf17b6874d8b72429dc7a44618c67a293d6
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-428744
size: "992"
- id: sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8032639"
- id: sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.13.1
size: "21166088"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21134420"
- id: sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.35.0-beta.0
size: "22428165"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.10.1
size: "265458"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-428744
size: "2173567"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-428744 image ls --format yaml --alsologtostderr:
I1201 19:47:14.065880   74356 out.go:360] Setting OutFile to fd 1 ...
I1201 19:47:14.066103   74356 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:14.066133   74356 out.go:374] Setting ErrFile to fd 2...
I1201 19:47:14.066155   74356 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:14.066496   74356 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:47:14.067252   74356 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:14.067471   74356 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:14.068099   74356 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:47:14.088425   74356 ssh_runner.go:195] Run: systemctl --version
I1201 19:47:14.088488   74356 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:47:14.111237   74356 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
I1201 19:47:14.216160   74356 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml (0.24s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild (3.47s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-428744 ssh pgrep buildkitd: exit status 1 (264.796275ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image build -t localhost/my-image:functional-428744 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-428744 image build -t localhost/my-image:functional-428744 testdata/build --alsologtostderr: (2.991005035s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-428744 image build -t localhost/my-image:functional-428744 testdata/build --alsologtostderr:
I1201 19:47:10.850748   74140 out.go:360] Setting OutFile to fd 1 ...
I1201 19:47:10.850883   74140 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:10.850895   74140 out.go:374] Setting ErrFile to fd 2...
I1201 19:47:10.850902   74140 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1201 19:47:10.851184   74140 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
I1201 19:47:10.851805   74140 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:10.852500   74140 config.go:182] Loaded profile config "functional-428744": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1201 19:47:10.853074   74140 cli_runner.go:164] Run: docker container inspect functional-428744 --format={{.State.Status}}
I1201 19:47:10.871147   74140 ssh_runner.go:195] Run: systemctl --version
I1201 19:47:10.871213   74140 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-428744
I1201 19:47:10.889041   74140 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/functional-428744/id_rsa Username:docker}
I1201 19:47:10.992210   74140 build_images.go:162] Building image from path: /tmp/build.1821128276.tar
I1201 19:47:10.992303   74140 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1201 19:47:11.000245   74140 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.1821128276.tar
I1201 19:47:11.003970   74140 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.1821128276.tar: stat -c "%s %y" /var/lib/minikube/build/build.1821128276.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.1821128276.tar': No such file or directory
I1201 19:47:11.004001   74140 ssh_runner.go:362] scp /tmp/build.1821128276.tar --> /var/lib/minikube/build/build.1821128276.tar (3072 bytes)
I1201 19:47:11.023869   74140 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.1821128276
I1201 19:47:11.033520   74140 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.1821128276 -xf /var/lib/minikube/build/build.1821128276.tar
I1201 19:47:11.042590   74140 containerd.go:394] Building image: /var/lib/minikube/build/build.1821128276
I1201 19:47:11.042666   74140 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1821128276 --local dockerfile=/var/lib/minikube/build/build.1821128276 --output type=image,name=localhost/my-image:functional-428744
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.5s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

#6 [2/3] RUN true
#6 DONE 0.2s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:4e7e74bc4eedcfa49f6ca9b3c075a7ff3dcbbee7118055dad7e18c23289011fa
#8 exporting manifest sha256:4e7e74bc4eedcfa49f6ca9b3c075a7ff3dcbbee7118055dad7e18c23289011fa 0.0s done
#8 exporting config sha256:4080434db4544c9ad723959f1a84a91883dfea2013af4850c1bea2166ef7f4e4 0.0s done
#8 naming to localhost/my-image:functional-428744 done
#8 DONE 0.2s
I1201 19:47:13.764970   74140 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1821128276 --local dockerfile=/var/lib/minikube/build/build.1821128276 --output type=image,name=localhost/my-image:functional-428744: (2.722262155s)
I1201 19:47:13.765052   74140 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.1821128276
I1201 19:47:13.773395   74140 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.1821128276.tar
I1201 19:47:13.781641   74140 build_images.go:218] Built localhost/my-image:functional-428744 from /tmp/build.1821128276.tar
I1201 19:47:13.781682   74140 build_images.go:134] succeeded building to: functional-428744
I1201 19:47:13.781700   74140 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild (3.47s)
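The stderr transcript above shows minikube packing the build context into a tar (`/tmp/build.1821128276.tar`, 3072 bytes), copying it to the node, and unpacking it before invoking buildctl. A minimal sketch of that packing step — hypothetical file contents mirroring the Dockerfile steps visible in the buildctl output (`FROM busybox`, `RUN true`, `ADD content.txt /`), stdlib only, not minikube's actual implementation — is:

```python
import io
import tarfile

# Hypothetical build context mirroring the transcript: a Dockerfile plus
# the content.txt that the ADD step copies into the image.
files = {
    "Dockerfile": b"FROM gcr.io/k8s-minikube/busybox:latest\nRUN true\nADD content.txt /\n",
    "content.txt": b"hello\n",
}

# Pack the context into an in-memory tar, the way minikube ships it over SSH.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    for name, data in files.items():
        info = tarfile.TarInfo(name=name)
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))

# Re-open the archive to confirm both entries made it in.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r") as tar:
    names = tar.getnames()
print(names)
```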

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-428744
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon (1.14s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image load --daemon kicbase/echo-server:functional-428744 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon (1.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon (1.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image load --daemon kicbase/echo-server:functional-428744 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon (1.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon (1.31s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-428744
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image load --daemon kicbase/echo-server:functional-428744 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon (1.31s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile (0.34s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image save kicbase/echo-server:functional-428744 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile (0.34s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove (0.48s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image rm kicbase/echo-server:functional-428744 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove (0.48s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile (0.67s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile (0.67s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon (0.38s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-428744
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 image save --daemon kicbase/echo-server:functional-428744 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-428744
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon (0.38s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes (0.17s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes (0.17s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster (0.19s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster (0.19s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters (0.17s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-428744 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters (0.17s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images (0.04s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-428744
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images (0.04s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-428744
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-428744
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images (0.02s)

TestMultiControlPlane/serial/StartCluster (209.24s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1201 19:49:27.087662    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:52.602402    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:52.608795    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:52.620092    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:52.641425    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:52.682790    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:52.764249    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:52.925793    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:53.247423    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:53.889374    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:55.170673    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:49:57.732140    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:50:02.854092    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:50:13.096365    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:50:33.578360    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:51:14.540364    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:51:46.971878    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:101: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (3m28.282797318s)
ha_test.go:107: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/StartCluster (209.24s)

TestMultiControlPlane/serial/DeployApp (8.72s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 kubectl -- rollout status deployment/busybox: (5.691848257s)
ha_test.go:140: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-85pz9 -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-9nqwt -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-vbmtp -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-85pz9 -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-9nqwt -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-vbmtp -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-85pz9 -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-9nqwt -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-vbmtp -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (8.72s)

TestMultiControlPlane/serial/PingHostFromPods (1.81s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-85pz9 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-85pz9 -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-9nqwt -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-9nqwt -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-vbmtp -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
E1201 19:52:36.461779    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 kubectl -- exec busybox-7b57f96db7-vbmtp -- sh -c "ping -c 1 192.168.49.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.81s)

TestMultiControlPlane/serial/AddWorkerNode (28.81s)

=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 node add --alsologtostderr -v 5
ha_test.go:228: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 node add --alsologtostderr -v 5: (27.70971252s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
ha_test.go:234: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5: (1.102546852s)
--- PASS: TestMultiControlPlane/serial/AddWorkerNode (28.81s)

TestMultiControlPlane/serial/NodeLabels (0.11s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-308856 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.11s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (1.11s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.107044046s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (1.11s)

TestMultiControlPlane/serial/CopyFile (20.53s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:328: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --output json --alsologtostderr -v 5
ha_test.go:328: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 status --output json --alsologtostderr -v 5: (1.134344343s)
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp testdata/cp-test.txt ha-308856:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile1365624573/001/cp-test_ha-308856.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856:/home/docker/cp-test.txt ha-308856-m02:/home/docker/cp-test_ha-308856_ha-308856-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test_ha-308856_ha-308856-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856:/home/docker/cp-test.txt ha-308856-m03:/home/docker/cp-test_ha-308856_ha-308856-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test_ha-308856_ha-308856-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856:/home/docker/cp-test.txt ha-308856-m04:/home/docker/cp-test_ha-308856_ha-308856-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test_ha-308856_ha-308856-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp testdata/cp-test.txt ha-308856-m02:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m02:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile1365624573/001/cp-test_ha-308856-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m02:/home/docker/cp-test.txt ha-308856:/home/docker/cp-test_ha-308856-m02_ha-308856.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test_ha-308856-m02_ha-308856.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m02:/home/docker/cp-test.txt ha-308856-m03:/home/docker/cp-test_ha-308856-m02_ha-308856-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test_ha-308856-m02_ha-308856-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m02:/home/docker/cp-test.txt ha-308856-m04:/home/docker/cp-test_ha-308856-m02_ha-308856-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test_ha-308856-m02_ha-308856-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp testdata/cp-test.txt ha-308856-m03:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m03:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile1365624573/001/cp-test_ha-308856-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m03:/home/docker/cp-test.txt ha-308856:/home/docker/cp-test_ha-308856-m03_ha-308856.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test_ha-308856-m03_ha-308856.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m03:/home/docker/cp-test.txt ha-308856-m02:/home/docker/cp-test_ha-308856-m03_ha-308856-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test_ha-308856-m03_ha-308856-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m03:/home/docker/cp-test.txt ha-308856-m04:/home/docker/cp-test_ha-308856-m03_ha-308856-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test_ha-308856-m03_ha-308856-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp testdata/cp-test.txt ha-308856-m04:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m04:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile1365624573/001/cp-test_ha-308856-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m04:/home/docker/cp-test.txt ha-308856:/home/docker/cp-test_ha-308856-m04_ha-308856.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856 "sudo cat /home/docker/cp-test_ha-308856-m04_ha-308856.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m04:/home/docker/cp-test.txt ha-308856-m02:/home/docker/cp-test_ha-308856-m04_ha-308856-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m02 "sudo cat /home/docker/cp-test_ha-308856-m04_ha-308856-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 cp ha-308856-m04:/home/docker/cp-test.txt ha-308856-m03:/home/docker/cp-test_ha-308856-m04_ha-308856-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 ssh -n ha-308856-m03 "sudo cat /home/docker/cp-test_ha-308856-m04_ha-308856-m03.txt"
--- PASS: TestMultiControlPlane/serial/CopyFile (20.53s)

TestMultiControlPlane/serial/StopSecondaryNode (13.02s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:365: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 node stop m02 --alsologtostderr -v 5
ha_test.go:365: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 node stop m02 --alsologtostderr -v 5: (12.197173203s)
ha_test.go:371: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
ha_test.go:371: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5: exit status 7 (820.064072ms)

-- stdout --
	ha-308856
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-308856-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-308856-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-308856-m04
	type: Worker
	host: Running
	kubelet: Running
	

-- /stdout --
** stderr ** 
	I1201 19:53:39.610880   91712 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:53:39.611037   91712 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:53:39.611068   91712 out.go:374] Setting ErrFile to fd 2...
	I1201 19:53:39.611085   91712 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:53:39.611356   91712 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:53:39.611583   91712 out.go:368] Setting JSON to false
	I1201 19:53:39.611629   91712 mustload.go:66] Loading cluster: ha-308856
	I1201 19:53:39.611733   91712 notify.go:221] Checking for updates...
	I1201 19:53:39.612079   91712 config.go:182] Loaded profile config "ha-308856": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 19:53:39.612099   91712 status.go:174] checking status of ha-308856 ...
	I1201 19:53:39.613044   91712 cli_runner.go:164] Run: docker container inspect ha-308856 --format={{.State.Status}}
	I1201 19:53:39.635280   91712 status.go:371] ha-308856 host status = "Running" (err=<nil>)
	I1201 19:53:39.635301   91712 host.go:66] Checking if "ha-308856" exists ...
	I1201 19:53:39.635616   91712 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-308856
	I1201 19:53:39.671607   91712 host.go:66] Checking if "ha-308856" exists ...
	I1201 19:53:39.671899   91712 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:53:39.671937   91712 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-308856
	I1201 19:53:39.694306   91712 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32793 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/ha-308856/id_rsa Username:docker}
	I1201 19:53:39.815153   91712 ssh_runner.go:195] Run: systemctl --version
	I1201 19:53:39.821793   91712 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:53:39.835104   91712 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 19:53:39.894745   91712 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:4 ContainersRunning:3 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:62 OomKillDisable:true NGoroutines:72 SystemTime:2025-12-01 19:53:39.884766645 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 19:53:39.895420   91712 kubeconfig.go:125] found "ha-308856" server: "https://192.168.49.254:8443"
	I1201 19:53:39.895463   91712 api_server.go:166] Checking apiserver status ...
	I1201 19:53:39.895512   91712 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:53:39.908254   91712 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1367/cgroup
	I1201 19:53:39.916734   91712 api_server.go:182] apiserver freezer: "6:freezer:/docker/4352b6adaeefdfb76ecc136505553b1ffbebbef2de790e13202f4f46a5e42f93/kubepods/burstable/podb1eb57f73886fa49aafadab685e740f9/88e7e9dcf10f603ac242950ef33518f7c7ccd2abd85c6bd28c8e2817e4b96ee4"
	I1201 19:53:39.916814   91712 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/4352b6adaeefdfb76ecc136505553b1ffbebbef2de790e13202f4f46a5e42f93/kubepods/burstable/podb1eb57f73886fa49aafadab685e740f9/88e7e9dcf10f603ac242950ef33518f7c7ccd2abd85c6bd28c8e2817e4b96ee4/freezer.state
	I1201 19:53:39.924197   91712 api_server.go:204] freezer state: "THAWED"
	I1201 19:53:39.924227   91712 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1201 19:53:39.932448   91712 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1201 19:53:39.932479   91712 status.go:463] ha-308856 apiserver status = Running (err=<nil>)
	I1201 19:53:39.932497   91712 status.go:176] ha-308856 status: &{Name:ha-308856 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1201 19:53:39.932516   91712 status.go:174] checking status of ha-308856-m02 ...
	I1201 19:53:39.932850   91712 cli_runner.go:164] Run: docker container inspect ha-308856-m02 --format={{.State.Status}}
	I1201 19:53:39.951784   91712 status.go:371] ha-308856-m02 host status = "Stopped" (err=<nil>)
	I1201 19:53:39.951812   91712 status.go:384] host is not running, skipping remaining checks
	I1201 19:53:39.951819   91712 status.go:176] ha-308856-m02 status: &{Name:ha-308856-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1201 19:53:39.951839   91712 status.go:174] checking status of ha-308856-m03 ...
	I1201 19:53:39.952159   91712 cli_runner.go:164] Run: docker container inspect ha-308856-m03 --format={{.State.Status}}
	I1201 19:53:39.969858   91712 status.go:371] ha-308856-m03 host status = "Running" (err=<nil>)
	I1201 19:53:39.969885   91712 host.go:66] Checking if "ha-308856-m03" exists ...
	I1201 19:53:39.970200   91712 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-308856-m03
	I1201 19:53:39.987536   91712 host.go:66] Checking if "ha-308856-m03" exists ...
	I1201 19:53:39.987988   91712 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:53:39.988065   91712 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-308856-m03
	I1201 19:53:40.006958   91712 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32803 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/ha-308856-m03/id_rsa Username:docker}
	I1201 19:53:40.115680   91712 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:53:40.130055   91712 kubeconfig.go:125] found "ha-308856" server: "https://192.168.49.254:8443"
	I1201 19:53:40.130086   91712 api_server.go:166] Checking apiserver status ...
	I1201 19:53:40.130156   91712 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 19:53:40.149841   91712 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1349/cgroup
	I1201 19:53:40.160141   91712 api_server.go:182] apiserver freezer: "6:freezer:/docker/882ece69ad4033852c3b7ab95f26d4af8b5d6933cf347f567b5542437b1842c2/kubepods/burstable/pod6e1694ec635c78a77bc0c19a028aa43a/57d69314524e1d19e8a746b6aba1325c10d2496df9875b5c0b8b478f7b5dd7eb"
	I1201 19:53:40.160273   91712 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/882ece69ad4033852c3b7ab95f26d4af8b5d6933cf347f567b5542437b1842c2/kubepods/burstable/pod6e1694ec635c78a77bc0c19a028aa43a/57d69314524e1d19e8a746b6aba1325c10d2496df9875b5c0b8b478f7b5dd7eb/freezer.state
	I1201 19:53:40.172120   91712 api_server.go:204] freezer state: "THAWED"
	I1201 19:53:40.172149   91712 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1201 19:53:40.182264   91712 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1201 19:53:40.182305   91712 status.go:463] ha-308856-m03 apiserver status = Running (err=<nil>)
	I1201 19:53:40.182316   91712 status.go:176] ha-308856-m03 status: &{Name:ha-308856-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1201 19:53:40.182334   91712 status.go:174] checking status of ha-308856-m04 ...
	I1201 19:53:40.182665   91712 cli_runner.go:164] Run: docker container inspect ha-308856-m04 --format={{.State.Status}}
	I1201 19:53:40.200500   91712 status.go:371] ha-308856-m04 host status = "Running" (err=<nil>)
	I1201 19:53:40.200538   91712 host.go:66] Checking if "ha-308856-m04" exists ...
	I1201 19:53:40.200948   91712 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-308856-m04
	I1201 19:53:40.225834   91712 host.go:66] Checking if "ha-308856-m04" exists ...
	I1201 19:53:40.226158   91712 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 19:53:40.226200   91712 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-308856-m04
	I1201 19:53:40.246372   91712 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32808 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/ha-308856-m04/id_rsa Username:docker}
	I1201 19:53:40.351437   91712 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 19:53:40.366160   91712 status.go:176] ha-308856-m04 status: &{Name:ha-308856-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopSecondaryNode (13.02s)

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.83s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.83s)

TestMultiControlPlane/serial/RestartSecondaryNode (13.98s)

=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 node start m02 --alsologtostderr -v 5
ha_test.go:422: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 node start m02 --alsologtostderr -v 5: (12.182472268s)
ha_test.go:430: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
ha_test.go:430: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5: (1.667206395s)
ha_test.go:450: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiControlPlane/serial/RestartSecondaryNode (13.98s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.37s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.373308177s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.37s)

TestMultiControlPlane/serial/RestartClusterKeepsNodes (91.94s)

=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:458: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 node list --alsologtostderr -v 5
ha_test.go:464: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 stop --alsologtostderr -v 5
E1201 19:54:27.088053    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:464: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 stop --alsologtostderr -v 5: (31.681437303s)
ha_test.go:469: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 start --wait true --alsologtostderr -v 5
E1201 19:54:50.046353    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:54:52.598484    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 19:55:20.303291    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:469: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 start --wait true --alsologtostderr -v 5: (1m0.081325027s)
ha_test.go:474: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 node list --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/RestartClusterKeepsNodes (91.94s)

TestMultiControlPlane/serial/DeleteSecondaryNode (11.31s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:489: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 node delete m03 --alsologtostderr -v 5
ha_test.go:489: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 node delete m03 --alsologtostderr -v 5: (10.321222618s)
ha_test.go:495: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
ha_test.go:513: (dbg) Run:  kubectl get nodes
ha_test.go:521: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/DeleteSecondaryNode (11.31s)

TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.84s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.84s)
=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:533: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 stop --alsologtostderr -v 5
ha_test.go:533: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 stop --alsologtostderr -v 5: (36.278915907s)
ha_test.go:539: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
ha_test.go:539: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5: exit status 7 (127.362533ms)
-- stdout --
	ha-308856
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-308856-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-308856-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I1201 19:56:16.972116  106807 out.go:360] Setting OutFile to fd 1 ...
	I1201 19:56:16.972259  106807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:56:16.972286  106807 out.go:374] Setting ErrFile to fd 2...
	I1201 19:56:16.972306  106807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 19:56:16.972572  106807 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 19:56:16.972804  106807 out.go:368] Setting JSON to false
	I1201 19:56:16.972860  106807 mustload.go:66] Loading cluster: ha-308856
	I1201 19:56:16.972953  106807 notify.go:221] Checking for updates...
	I1201 19:56:16.973385  106807 config.go:182] Loaded profile config "ha-308856": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 19:56:16.973406  106807 status.go:174] checking status of ha-308856 ...
	I1201 19:56:16.974055  106807 cli_runner.go:164] Run: docker container inspect ha-308856 --format={{.State.Status}}
	I1201 19:56:16.994538  106807 status.go:371] ha-308856 host status = "Stopped" (err=<nil>)
	I1201 19:56:16.994561  106807 status.go:384] host is not running, skipping remaining checks
	I1201 19:56:16.994574  106807 status.go:176] ha-308856 status: &{Name:ha-308856 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1201 19:56:16.994617  106807 status.go:174] checking status of ha-308856-m02 ...
	I1201 19:56:16.994963  106807 cli_runner.go:164] Run: docker container inspect ha-308856-m02 --format={{.State.Status}}
	I1201 19:56:17.025988  106807 status.go:371] ha-308856-m02 host status = "Stopped" (err=<nil>)
	I1201 19:56:17.026019  106807 status.go:384] host is not running, skipping remaining checks
	I1201 19:56:17.026027  106807 status.go:176] ha-308856-m02 status: &{Name:ha-308856-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1201 19:56:17.026050  106807 status.go:174] checking status of ha-308856-m04 ...
	I1201 19:56:17.026410  106807 cli_runner.go:164] Run: docker container inspect ha-308856-m04 --format={{.State.Status}}
	I1201 19:56:17.046341  106807 status.go:371] ha-308856-m04 host status = "Stopped" (err=<nil>)
	I1201 19:56:17.046366  106807 status.go:384] host is not running, skipping remaining checks
	I1201 19:56:17.046373  106807 status.go:176] ha-308856-m04 status: &{Name:ha-308856-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiControlPlane/serial/StopCluster (36.41s)
=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:562: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1201 19:56:46.971422    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:562: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (58.854354105s)
ha_test.go:568: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
ha_test.go:586: (dbg) Run:  kubectl get nodes
ha_test.go:594: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/RestartCluster (59.83s)
=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.81s)
=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:607: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 node add --control-plane --alsologtostderr -v 5
ha_test.go:607: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 node add --control-plane --alsologtostderr -v 5: (48.049556734s)
ha_test.go:613: (dbg) Run:  out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5
ha_test.go:613: (dbg) Done: out/minikube-linux-arm64 -p ha-308856 status --alsologtostderr -v 5: (1.077189747s)
--- PASS: TestMultiControlPlane/serial/AddSecondaryNode (49.13s)
=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.169681309s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.17s)
=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-373682 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd
E1201 19:59:27.087195    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p json-output-373682 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd: (1m20.768295589s)
--- PASS: TestJSONOutput/start/Command (80.77s)
=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)
=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)
=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)
=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 pause -p json-output-373682 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.72s)
=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)
=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)
=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)
=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 unpause -p json-output-373682 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.67s)
=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)
=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)
=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)
=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 stop -p json-output-373682 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 stop -p json-output-373682 --output=json --user=testUser: (6.061165112s)
--- PASS: TestJSONOutput/stop/Command (6.06s)
=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)
=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)
=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)
=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-error-427004 --memory=3072 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p json-output-error-427004 --memory=3072 --output=json --wait=true --driver=fail: exit status 56 (93.688843ms)
-- stdout --
	{"specversion":"1.0","id":"1b4a0bf3-aab0-4ea1-b70e-dd08766c5188","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-427004] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"6ec95323-be04-45b0-bdc7-77d1d106ea7f","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=21997"}}
	{"specversion":"1.0","id":"7a710e3c-c08f-421e-950a-687c298420fc","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"0286d274-2e9d-43dd-8057-188b20f0b7dc","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig"}}
	{"specversion":"1.0","id":"bca280c1-08bf-4eec-8d67-8b1836508abf","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube"}}
	{"specversion":"1.0","id":"53fa1843-bb03-4579-b5c5-0325e279e84d","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"d3ccb9fd-1269-4429-bde7-424ac1941f05","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"28a2e94b-3303-4845-a9af-a37efd4c33b0","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/arm64","name":"DRV_UNSUPPORTED_OS","url":""}}
-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-427004" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p json-output-error-427004
--- PASS: TestErrorJSONOutput (0.24s)
=== RUN   TestKicCustomNetwork/create_custom_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-377808 --network=
E1201 19:59:52.598839    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-377808 --network=: (53.450608745s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:175: Cleaning up "docker-network-377808" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-377808
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-377808: (2.209232046s)
--- PASS: TestKicCustomNetwork/create_custom_network (55.70s)
=== RUN   TestKicCustomNetwork/use_default_bridge_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-028710 --network=bridge
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-028710 --network=bridge: (31.857925479s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:175: Cleaning up "docker-network-028710" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-028710
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-028710: (2.128147672s)
--- PASS: TestKicCustomNetwork/use_default_bridge_network (34.01s)
=== RUN   TestKicExistingNetwork
I1201 20:01:21.610275    4305 cli_runner.go:164] Run: docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
W1201 20:01:21.626746    4305 cli_runner.go:211] docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
I1201 20:01:21.626827    4305 network_create.go:284] running [docker network inspect existing-network] to gather additional debugging logs...
I1201 20:01:21.626844    4305 cli_runner.go:164] Run: docker network inspect existing-network
W1201 20:01:21.643063    4305 cli_runner.go:211] docker network inspect existing-network returned with exit code 1
I1201 20:01:21.643093    4305 network_create.go:287] error running [docker network inspect existing-network]: docker network inspect existing-network: exit status 1
stdout:
[]
stderr:
Error response from daemon: network existing-network not found
I1201 20:01:21.643110    4305 network_create.go:289] output of [docker network inspect existing-network]: -- stdout --
[]
-- /stdout --
** stderr ** 
Error response from daemon: network existing-network not found
** /stderr **
I1201 20:01:21.643224    4305 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1201 20:01:21.659108    4305 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-4828e2f47bd3 IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:ba:78:79:c6:63:d1} reservation:<nil>}
I1201 20:01:21.659393    4305 network.go:206] using free private subnet 192.168.58.0/24: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x400162f0b0}
I1201 20:01:21.659415    4305 network_create.go:124] attempt to create docker network existing-network 192.168.58.0/24 with gateway 192.168.58.1 and MTU of 1500 ...
I1201 20:01:21.659471    4305 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.58.0/24 --gateway=192.168.58.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=existing-network existing-network
I1201 20:01:21.724508    4305 network_create.go:108] docker network existing-network 192.168.58.0/24 created
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
kic_custom_network_test.go:93: (dbg) Run:  out/minikube-linux-arm64 start -p existing-network-533891 --network=existing-network
E1201 20:01:46.973823    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:93: (dbg) Done: out/minikube-linux-arm64 start -p existing-network-533891 --network=existing-network: (31.852248544s)
helpers_test.go:175: Cleaning up "existing-network-533891" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p existing-network-533891
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p existing-network-533891: (2.146388599s)
I1201 20:01:55.743708    4305 cli_runner.go:164] Run: docker network ls --filter=label=existing-network --format {{.Name}}
--- PASS: TestKicExistingNetwork (34.15s)
=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p custom-subnet-875586 --subnet=192.168.60.0/24
kic_custom_network_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p custom-subnet-875586 --subnet=192.168.60.0/24: (32.606644519s)
kic_custom_network_test.go:161: (dbg) Run:  docker network inspect custom-subnet-875586 --format "{{(index .IPAM.Config 0).Subnet}}"
helpers_test.go:175: Cleaning up "custom-subnet-875586" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p custom-subnet-875586
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p custom-subnet-875586: (2.304107619s)
--- PASS: TestKicCustomSubnet (34.93s)
=== RUN   TestKicStaticIP
kic_custom_network_test.go:132: (dbg) Run:  out/minikube-linux-arm64 start -p static-ip-360418 --static-ip=192.168.200.200
kic_custom_network_test.go:132: (dbg) Done: out/minikube-linux-arm64 start -p static-ip-360418 --static-ip=192.168.200.200: (33.625859871s)
kic_custom_network_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p static-ip-360418 ip
helpers_test.go:175: Cleaning up "static-ip-360418" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p static-ip-360418
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p static-ip-360418: (2.175593402s)
--- PASS: TestKicStaticIP (35.97s)
=== RUN   TestMainNoArgs
main_test.go:70: (dbg) Run:  out/minikube-linux-arm64
--- PASS: TestMainNoArgs (0.05s)
=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p first-198020 --driver=docker  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p first-198020 --driver=docker  --container-runtime=containerd: (33.75759086s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p second-200580 --driver=docker  --container-runtime=containerd
E1201 20:04:10.170494    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p second-200580 --driver=docker  --container-runtime=containerd: (36.07555101s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile first-198020
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile second-200580
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
helpers_test.go:175: Cleaning up "second-200580" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p second-200580
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p second-200580: (2.239078121s)
helpers_test.go:175: Cleaning up "first-198020" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p first-198020
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p first-198020: (2.426594573s)
--- PASS: TestMinikubeProfile (75.98s)
=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-1-875115 --memory=3072 --mount-string /tmp/TestMountStartserial3919217323/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
E1201 20:04:27.087039    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-1-875115 --memory=3072 --mount-string /tmp/TestMountStartserial3919217323/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.508014584s)
--- PASS: TestMountStart/serial/StartWithMountFirst (8.51s)
=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-1-875115 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountFirst (0.27s)
=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-877060 --memory=3072 --mount-string /tmp/TestMountStartserial3919217323/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-877060 --memory=3072 --mount-string /tmp/TestMountStartserial3919217323/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (5.453908881s)
--- PASS: TestMountStart/serial/StartWithMountSecond (6.45s)
=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-877060 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountSecond (0.29s)
=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p mount-start-1-875115 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p mount-start-1-875115 --alsologtostderr -v=5: (1.723830934s)
--- PASS: TestMountStart/serial/DeleteFirst (1.72s)
=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-877060 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.33s)
=== RUN   TestMountStart/serial/Stop
mount_start_test.go:196: (dbg) Run:  out/minikube-linux-arm64 stop -p mount-start-2-877060
mount_start_test.go:196: (dbg) Done: out/minikube-linux-arm64 stop -p mount-start-2-877060: (1.300125736s)
--- PASS: TestMountStart/serial/Stop (1.30s)
=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:207: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-877060
mount_start_test.go:207: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-877060: (7.109931991s)
--- PASS: TestMountStart/serial/RestartStopped (8.11s)
=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-877060 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.28s)
=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-315986 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
E1201 20:04:52.598583    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:06:15.665709    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:06:46.971479    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:96: (dbg) Done: out/minikube-linux-arm64 start -p multinode-315986 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (2m14.051144167s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (134.59s)
=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-arm64 kubectl -p multinode-315986 -- rollout status deployment/busybox: (4.058049523s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-f45p4 -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-lc2rs -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-f45p4 -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-lc2rs -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-f45p4 -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-lc2rs -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (6.01s)
=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-f45p4 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-f45p4 -- sh -c "ping -c 1 192.168.67.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-lc2rs -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-315986 -- exec busybox-7b57f96db7-lc2rs -- sh -c "ping -c 1 192.168.67.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (1.00s)
=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-315986 -v=5 --alsologtostderr
multinode_test.go:121: (dbg) Done: out/minikube-linux-arm64 node add -p multinode-315986 -v=5 --alsologtostderr: (57.774882394s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (58.50s)
=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-315986 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.09s)
=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.73s)
=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status --output json --alsologtostderr
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp testdata/cp-test.txt multinode-315986:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2779379740/001/cp-test_multinode-315986.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986:/home/docker/cp-test.txt multinode-315986-m02:/home/docker/cp-test_multinode-315986_multinode-315986-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m02 "sudo cat /home/docker/cp-test_multinode-315986_multinode-315986-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986:/home/docker/cp-test.txt multinode-315986-m03:/home/docker/cp-test_multinode-315986_multinode-315986-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m03 "sudo cat /home/docker/cp-test_multinode-315986_multinode-315986-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp testdata/cp-test.txt multinode-315986-m02:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2779379740/001/cp-test_multinode-315986-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986-m02:/home/docker/cp-test.txt multinode-315986:/home/docker/cp-test_multinode-315986-m02_multinode-315986.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986 "sudo cat /home/docker/cp-test_multinode-315986-m02_multinode-315986.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986-m02:/home/docker/cp-test.txt multinode-315986-m03:/home/docker/cp-test_multinode-315986-m02_multinode-315986-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m03 "sudo cat /home/docker/cp-test_multinode-315986-m02_multinode-315986-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp testdata/cp-test.txt multinode-315986-m03:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2779379740/001/cp-test_multinode-315986-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986-m03:/home/docker/cp-test.txt multinode-315986:/home/docker/cp-test_multinode-315986-m03_multinode-315986.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986 "sudo cat /home/docker/cp-test_multinode-315986-m03_multinode-315986.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 cp multinode-315986-m03:/home/docker/cp-test.txt multinode-315986-m02:/home/docker/cp-test_multinode-315986-m03_multinode-315986-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 ssh -n multinode-315986-m02 "sudo cat /home/docker/cp-test_multinode-315986-m03_multinode-315986-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (10.75s)
=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-arm64 -p multinode-315986 node stop m03: (1.31217557s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-315986 status: exit status 7 (536.30388ms)
-- stdout --
	multinode-315986
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-315986-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-315986-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr: exit status 7 (536.829462ms)
-- stdout --
	multinode-315986
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-315986-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-315986-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I1201 20:08:25.454627  160063 out.go:360] Setting OutFile to fd 1 ...
	I1201 20:08:25.454796  160063 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:08:25.454827  160063 out.go:374] Setting ErrFile to fd 2...
	I1201 20:08:25.454849  160063 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:08:25.455131  160063 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 20:08:25.455356  160063 out.go:368] Setting JSON to false
	I1201 20:08:25.455426  160063 mustload.go:66] Loading cluster: multinode-315986
	I1201 20:08:25.455503  160063 notify.go:221] Checking for updates...
	I1201 20:08:25.456771  160063 config.go:182] Loaded profile config "multinode-315986": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 20:08:25.456822  160063 status.go:174] checking status of multinode-315986 ...
	I1201 20:08:25.457539  160063 cli_runner.go:164] Run: docker container inspect multinode-315986 --format={{.State.Status}}
	I1201 20:08:25.476457  160063 status.go:371] multinode-315986 host status = "Running" (err=<nil>)
	I1201 20:08:25.476479  160063 host.go:66] Checking if "multinode-315986" exists ...
	I1201 20:08:25.476927  160063 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-315986
	I1201 20:08:25.504056  160063 host.go:66] Checking if "multinode-315986" exists ...
	I1201 20:08:25.504372  160063 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 20:08:25.504412  160063 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-315986
	I1201 20:08:25.530866  160063 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32913 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/multinode-315986/id_rsa Username:docker}
	I1201 20:08:25.635002  160063 ssh_runner.go:195] Run: systemctl --version
	I1201 20:08:25.641612  160063 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 20:08:25.654652  160063 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1201 20:08:25.710358  160063 info.go:266] docker info: {ID:J4M5:W6MX:GOX4:4LAQ:VI7E:VJNF:J3OP:OPBH:GF7G:PPY4:WQWD:7N4L Containers:3 ContainersRunning:2 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:50 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-01 20:08:25.700196045 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-24-2 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1201 20:08:25.710908  160063 kubeconfig.go:125] found "multinode-315986" server: "https://192.168.67.2:8443"
	I1201 20:08:25.710945  160063 api_server.go:166] Checking apiserver status ...
	I1201 20:08:25.710994  160063 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1201 20:08:25.723630  160063 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1365/cgroup
	I1201 20:08:25.732165  160063 api_server.go:182] apiserver freezer: "6:freezer:/docker/22a729831ec36720452a2ca069b47d40b927526d04703f3345bfa4d330db8b0e/kubepods/burstable/podb4eda0db87508309c8f6985a1573f867/016c4b9b31cead7c6491ef62e443544e7b131d871918e422050dfa49c2b0f232"
	I1201 20:08:25.732240  160063 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/22a729831ec36720452a2ca069b47d40b927526d04703f3345bfa4d330db8b0e/kubepods/burstable/podb4eda0db87508309c8f6985a1573f867/016c4b9b31cead7c6491ef62e443544e7b131d871918e422050dfa49c2b0f232/freezer.state
	I1201 20:08:25.740442  160063 api_server.go:204] freezer state: "THAWED"
	I1201 20:08:25.740472  160063 api_server.go:253] Checking apiserver healthz at https://192.168.67.2:8443/healthz ...
	I1201 20:08:25.748748  160063 api_server.go:279] https://192.168.67.2:8443/healthz returned 200:
	ok
	I1201 20:08:25.748780  160063 status.go:463] multinode-315986 apiserver status = Running (err=<nil>)
	I1201 20:08:25.748790  160063 status.go:176] multinode-315986 status: &{Name:multinode-315986 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1201 20:08:25.748838  160063 status.go:174] checking status of multinode-315986-m02 ...
	I1201 20:08:25.749174  160063 cli_runner.go:164] Run: docker container inspect multinode-315986-m02 --format={{.State.Status}}
	I1201 20:08:25.766942  160063 status.go:371] multinode-315986-m02 host status = "Running" (err=<nil>)
	I1201 20:08:25.766969  160063 host.go:66] Checking if "multinode-315986-m02" exists ...
	I1201 20:08:25.767288  160063 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-315986-m02
	I1201 20:08:25.784900  160063 host.go:66] Checking if "multinode-315986-m02" exists ...
	I1201 20:08:25.785218  160063 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1201 20:08:25.785261  160063 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-315986-m02
	I1201 20:08:25.802343  160063 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32918 SSHKeyPath:/home/jenkins/minikube-integration/21997-2497/.minikube/machines/multinode-315986-m02/id_rsa Username:docker}
	I1201 20:08:25.903053  160063 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1201 20:08:25.915943  160063 status.go:176] multinode-315986-m02 status: &{Name:multinode-315986-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1201 20:08:25.915977  160063 status.go:174] checking status of multinode-315986-m03 ...
	I1201 20:08:25.916285  160063 cli_runner.go:164] Run: docker container inspect multinode-315986-m03 --format={{.State.Status}}
	I1201 20:08:25.933391  160063 status.go:371] multinode-315986-m03 host status = "Stopped" (err=<nil>)
	I1201 20:08:25.933416  160063 status.go:384] host is not running, skipping remaining checks
	I1201 20:08:25.933423  160063 status.go:176] multinode-315986-m03 status: &{Name:multinode-315986-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.39s)
=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 node start m03 -v=5 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-linux-arm64 -p multinode-315986 node start m03 -v=5 --alsologtostderr: (7.061126062s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status -v=5 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (7.87s)
=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-315986
multinode_test.go:321: (dbg) Run:  out/minikube-linux-arm64 stop -p multinode-315986
multinode_test.go:321: (dbg) Done: out/minikube-linux-arm64 stop -p multinode-315986: (25.165498447s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-315986 --wait=true -v=5 --alsologtostderr
E1201 20:09:27.087252    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:09:52.598828    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:326: (dbg) Done: out/minikube-linux-arm64 start -p multinode-315986 --wait=true -v=5 --alsologtostderr: (57.592038078s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-315986
--- PASS: TestMultiNode/serial/RestartKeepsNodes (82.92s)
=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-linux-arm64 -p multinode-315986 node delete m03: (5.217625803s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (5.99s)
=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 stop
multinode_test.go:345: (dbg) Done: out/minikube-linux-arm64 -p multinode-315986 stop: (24.007728431s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-315986 status: exit status 7 (97.759676ms)
-- stdout --
	multinode-315986
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-315986-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr: exit status 7 (102.167734ms)
-- stdout --
	multinode-315986
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-315986-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I1201 20:10:26.869752  168838 out.go:360] Setting OutFile to fd 1 ...
	I1201 20:10:26.869953  168838 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:10:26.869979  168838 out.go:374] Setting ErrFile to fd 2...
	I1201 20:10:26.869997  168838 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:10:26.870305  168838 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 20:10:26.870524  168838 out.go:368] Setting JSON to false
	I1201 20:10:26.870585  168838 mustload.go:66] Loading cluster: multinode-315986
	I1201 20:10:26.870662  168838 notify.go:221] Checking for updates...
	I1201 20:10:26.871662  168838 config.go:182] Loaded profile config "multinode-315986": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 20:10:26.871703  168838 status.go:174] checking status of multinode-315986 ...
	I1201 20:10:26.872363  168838 cli_runner.go:164] Run: docker container inspect multinode-315986 --format={{.State.Status}}
	I1201 20:10:26.893653  168838 status.go:371] multinode-315986 host status = "Stopped" (err=<nil>)
	I1201 20:10:26.893673  168838 status.go:384] host is not running, skipping remaining checks
	I1201 20:10:26.893680  168838 status.go:176] multinode-315986 status: &{Name:multinode-315986 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1201 20:10:26.893709  168838 status.go:174] checking status of multinode-315986-m02 ...
	I1201 20:10:26.894026  168838 cli_runner.go:164] Run: docker container inspect multinode-315986-m02 --format={{.State.Status}}
	I1201 20:10:26.922055  168838 status.go:371] multinode-315986-m02 host status = "Stopped" (err=<nil>)
	I1201 20:10:26.922077  168838 status.go:384] host is not running, skipping remaining checks
	I1201 20:10:26.922085  168838 status.go:176] multinode-315986-m02 status: &{Name:multinode-315986-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (24.21s)

                                                
                                    
TestMultiNode/serial/RestartMultiNode (58.15s)

                                                
                                                
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-315986 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
multinode_test.go:376: (dbg) Done: out/minikube-linux-arm64 start -p multinode-315986 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (57.435905031s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-arm64 -p multinode-315986 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (58.15s)

                                                
                                    
TestMultiNode/serial/ValidateNameConflict (34.25s)

                                                
                                                
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-315986
multinode_test.go:464: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-315986-m02 --driver=docker  --container-runtime=containerd
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p multinode-315986-m02 --driver=docker  --container-runtime=containerd: exit status 14 (98.273236ms)

                                                
                                                
-- stdout --
	* [multinode-315986-m02] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! Profile name 'multinode-315986-m02' is duplicated with machine name 'multinode-315986-m02' in profile 'multinode-315986'
	X Exiting due to MK_USAGE: Profile name should be unique

                                                
                                                
** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-315986-m03 --driver=docker  --container-runtime=containerd
E1201 20:11:30.050858    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:11:46.973352    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:472: (dbg) Done: out/minikube-linux-arm64 start -p multinode-315986-m03 --driver=docker  --container-runtime=containerd: (31.625098184s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-315986
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-arm64 node add -p multinode-315986: exit status 80 (377.040679ms)

                                                
                                                
-- stdout --
	* Adding node m03 to cluster multinode-315986 as [worker]
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-315986-m03 already exists in multinode-315986-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-arm64 delete -p multinode-315986-m03
multinode_test.go:484: (dbg) Done: out/minikube-linux-arm64 delete -p multinode-315986-m03: (2.09712147s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (34.25s)

                                                
                                    
TestPreload (113.79s)

                                                
                                                
=== RUN   TestPreload
preload_test.go:41: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-183784 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd
preload_test.go:41: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-183784 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd: (55.201371598s)
preload_test.go:49: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-183784 image pull gcr.io/k8s-minikube/busybox
preload_test.go:49: (dbg) Done: out/minikube-linux-arm64 -p test-preload-183784 image pull gcr.io/k8s-minikube/busybox: (2.392840474s)
preload_test.go:55: (dbg) Run:  out/minikube-linux-arm64 stop -p test-preload-183784
preload_test.go:55: (dbg) Done: out/minikube-linux-arm64 stop -p test-preload-183784: (5.870041603s)
preload_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-183784 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd
preload_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-183784 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd: (47.555381198s)
preload_test.go:68: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-183784 image list
helpers_test.go:175: Cleaning up "test-preload-183784" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p test-preload-183784
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p test-preload-183784: (2.528690495s)
--- PASS: TestPreload (113.79s)

                                                
                                    
TestScheduledStopUnix (106.45s)

                                                
                                                
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-arm64 start -p scheduled-stop-761428 --memory=3072 --driver=docker  --container-runtime=containerd
E1201 20:14:27.087419    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-arm64 start -p scheduled-stop-761428 --memory=3072 --driver=docker  --container-runtime=containerd: (29.996445762s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-761428 --schedule 5m -v=5 --alsologtostderr
minikube stop output:

                                                
                                                
** stderr ** 
	I1201 20:14:27.421889  184697 out.go:360] Setting OutFile to fd 1 ...
	I1201 20:14:27.422134  184697 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:14:27.422166  184697 out.go:374] Setting ErrFile to fd 2...
	I1201 20:14:27.422187  184697 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:14:27.422449  184697 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 20:14:27.422731  184697 out.go:368] Setting JSON to false
	I1201 20:14:27.422878  184697 mustload.go:66] Loading cluster: scheduled-stop-761428
	I1201 20:14:27.423329  184697 config.go:182] Loaded profile config "scheduled-stop-761428": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 20:14:27.423434  184697 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/config.json ...
	I1201 20:14:27.423632  184697 mustload.go:66] Loading cluster: scheduled-stop-761428
	I1201 20:14:27.423778  184697 config.go:182] Loaded profile config "scheduled-stop-761428": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

                                                
                                                
** /stderr **
scheduled_stop_test.go:204: (dbg) Run:  out/minikube-linux-arm64 status --format={{.TimeToStop}} -p scheduled-stop-761428 -n scheduled-stop-761428
scheduled_stop_test.go:172: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-761428 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

                                                
                                                
** stderr ** 
	I1201 20:14:27.875577  184786 out.go:360] Setting OutFile to fd 1 ...
	I1201 20:14:27.875694  184786 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:14:27.875704  184786 out.go:374] Setting ErrFile to fd 2...
	I1201 20:14:27.875709  184786 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:14:27.876407  184786 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 20:14:27.876752  184786 out.go:368] Setting JSON to false
	I1201 20:14:27.876984  184786 daemonize_unix.go:73] killing process 184714 as it is an old scheduled stop
	I1201 20:14:27.877102  184786 mustload.go:66] Loading cluster: scheduled-stop-761428
	I1201 20:14:27.877549  184786 config.go:182] Loaded profile config "scheduled-stop-761428": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 20:14:27.877665  184786 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/config.json ...
	I1201 20:14:27.877889  184786 mustload.go:66] Loading cluster: scheduled-stop-761428
	I1201 20:14:27.878053  184786 config.go:182] Loaded profile config "scheduled-stop-761428": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

                                                
                                                
** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
I1201 20:14:27.883625    4305 retry.go:31] will retry after 84.868µs: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.883836    4305 retry.go:31] will retry after 128.985µs: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.887508    4305 retry.go:31] will retry after 236.369µs: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.888650    4305 retry.go:31] will retry after 185.636µs: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.889759    4305 retry.go:31] will retry after 269.9µs: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.890885    4305 retry.go:31] will retry after 792.862µs: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.891952    4305 retry.go:31] will retry after 1.123106ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.894122    4305 retry.go:31] will retry after 1.292632ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.896305    4305 retry.go:31] will retry after 1.572852ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.898433    4305 retry.go:31] will retry after 4.55558ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.903647    4305 retry.go:31] will retry after 6.437299ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.910874    4305 retry.go:31] will retry after 6.827348ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.918113    4305 retry.go:31] will retry after 12.963229ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.931350    4305 retry.go:31] will retry after 17.343038ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.949577    4305 retry.go:31] will retry after 23.530126ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
I1201 20:14:27.973825    4305 retry.go:31] will retry after 29.897848ms: open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/pid: no such file or directory
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-761428 --cancel-scheduled
minikube stop output:

                                                
                                                
-- stdout --
	* All existing scheduled stops cancelled

                                                
                                                
-- /stdout --
E1201 20:14:52.602153    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-761428 -n scheduled-stop-761428
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-761428
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-761428 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

                                                
                                                
** stderr ** 
	I1201 20:14:53.821295  185466 out.go:360] Setting OutFile to fd 1 ...
	I1201 20:14:53.821521  185466 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:14:53.821552  185466 out.go:374] Setting ErrFile to fd 2...
	I1201 20:14:53.821571  185466 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1201 20:14:53.822041  185466 root.go:338] Updating PATH: /home/jenkins/minikube-integration/21997-2497/.minikube/bin
	I1201 20:14:53.822467  185466 out.go:368] Setting JSON to false
	I1201 20:14:53.822632  185466 mustload.go:66] Loading cluster: scheduled-stop-761428
	I1201 20:14:53.823302  185466 config.go:182] Loaded profile config "scheduled-stop-761428": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1201 20:14:53.823433  185466 profile.go:143] Saving config to /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/scheduled-stop-761428/config.json ...
	I1201 20:14:53.823714  185466 mustload.go:66] Loading cluster: scheduled-stop-761428
	I1201 20:14:53.823918  185466 config.go:182] Loaded profile config "scheduled-stop-761428": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

                                                
                                                
** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-761428
scheduled_stop_test.go:218: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p scheduled-stop-761428: exit status 7 (72.439146ms)

                                                
                                                
-- stdout --
	scheduled-stop-761428
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-761428 -n scheduled-stop-761428
scheduled_stop_test.go:189: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-761428 -n scheduled-stop-761428: exit status 7 (68.617804ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
scheduled_stop_test.go:189: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-761428" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p scheduled-stop-761428
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p scheduled-stop-761428: (4.847675181s)
--- PASS: TestScheduledStopUnix (106.45s)

                                                
                                    
TestInsufficientStorage (9.93s)

                                                
                                                
=== RUN   TestInsufficientStorage
status_test.go:50: (dbg) Run:  out/minikube-linux-arm64 start -p insufficient-storage-895439 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd
status_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p insufficient-storage-895439 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd: exit status 26 (7.341627441s)

                                                
                                                
-- stdout --
	{"specversion":"1.0","id":"8a5e39a6-d914-4cab-9d00-eab6ebf9f5b6","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[insufficient-storage-895439] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"1a0a4a7a-e468-4699-a8e3-007bcebee922","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=21997"}}
	{"specversion":"1.0","id":"c81e1b59-7531-41c1-a21c-ba5dbeaf0184","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"3dc34a6e-bb39-4bb0-a373-dddda3a7f4c4","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig"}}
	{"specversion":"1.0","id":"8f66c5a8-2b20-4ae8-a8ea-eb8d36c83386","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube"}}
	{"specversion":"1.0","id":"d72d36db-0154-41a9-a1ae-ab5b910c5c79","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"ff899e8f-9fa4-4242-a5ee-1523df26c04e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"e59f16a7-8d12-4eb1-939a-7799e8cf94e7","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_STORAGE_CAPACITY=100"}}
	{"specversion":"1.0","id":"b2598cc2-718b-47bb-aff0-9d48e029a5b8","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_AVAILABLE_STORAGE=19"}}
	{"specversion":"1.0","id":"c1b1dba1-4784-4799-b77d-e195d578b378","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"1","message":"Using the docker driver based on user configuration","name":"Selecting Driver","totalsteps":"19"}}
	{"specversion":"1.0","id":"64965ba7-21dc-41e5-b9f8-46a47a0bac17","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"Using Docker driver with root privileges"}}
	{"specversion":"1.0","id":"233c051b-fa7a-4ae3-baa4-eded9f272bac","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"3","message":"Starting \"insufficient-storage-895439\" primary control-plane node in \"insufficient-storage-895439\" cluster","name":"Starting Node","totalsteps":"19"}}
	{"specversion":"1.0","id":"bb6ebd47-8c18-45f7-ab13-876a4c55ebcb","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"5","message":"Pulling base image v0.0.48-1764169655-21974 ...","name":"Pulling Base Image","totalsteps":"19"}}
	{"specversion":"1.0","id":"46fd9c1a-c5f0-4975-99f7-546e949c53ba","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"8","message":"Creating docker container (CPUs=2, Memory=3072MB) ...","name":"Creating Container","totalsteps":"19"}}
	{"specversion":"1.0","id":"ef191bfc-739f-40a6-a65b-b8f21c964bb1","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"Try one or more of the following to free up space on the device:\n\n\t\t\t1. Run \"docker system prune\" to remove unused Docker data (optionally with \"-a\")\n\t\t\t2. Increase the storage allocated to Docker for Desktop by clicking on:\n\t\t\t\tDocker icon \u003e Preferences \u003e Resources \u003e Disk Image Size\n\t\t\t3. Run \"minikube ssh -- docker system prune\" if using the Docker container runtime","exitcode":"26","issues":"https://github.com/kubernetes/minikube/issues/9024","message":"Docker is out of disk space! (/var is at 100% of capacity). You can pass '--force' to skip this check.","name":"RSRC_DOCKER_STORAGE","url":""}}

                                                
                                                
-- /stdout --
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-895439 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-895439 --output=json --layout=cluster: exit status 7 (302.528019ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-895439","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","Step":"Creating Container","StepDetail":"Creating docker container (CPUs=2, Memory=3072MB) ...","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-895439","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1201 20:15:51.449836  187297 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-895439" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig

                                                
                                                
** /stderr **
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-895439 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-895439 --output=json --layout=cluster: exit status 7 (295.028055ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-895439","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-895439","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1201 20:15:51.746653  187362 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-895439" does not appear in /home/jenkins/minikube-integration/21997-2497/kubeconfig
	E1201 20:15:51.756135  187362 status.go:258] unable to read event log: stat: stat /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/insufficient-storage-895439/events.json: no such file or directory

                                                
                                                
** /stderr **
helpers_test.go:175: Cleaning up "insufficient-storage-895439" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p insufficient-storage-895439
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p insufficient-storage-895439: (1.984517755s)
--- PASS: TestInsufficientStorage (9.93s)

                                                
                                    
TestRunningBinaryUpgrade (321.51s)

                                                
                                                
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.35.0.3076372189 start -p running-upgrade-960321 --memory=3072 --vm-driver=docker  --container-runtime=containerd
E1201 20:24:27.087913    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.35.0.3076372189 start -p running-upgrade-960321 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (30.539861921s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-arm64 start -p running-upgrade-960321 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1201 20:24:52.599261    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:26:46.973629    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:28:10.052346    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:29:27.087915    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-arm64 start -p running-upgrade-960321 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (4m37.912025548s)
helpers_test.go:175: Cleaning up "running-upgrade-960321" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p running-upgrade-960321
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p running-upgrade-960321: (1.950429052s)
--- PASS: TestRunningBinaryUpgrade (321.51s)

TestMissingContainerUpgrade (188.27s)
=== RUN   TestMissingContainerUpgrade
=== PAUSE TestMissingContainerUpgrade
=== CONT  TestMissingContainerUpgrade
version_upgrade_test.go:309: (dbg) Run:  /tmp/minikube-v1.35.0.3376685709 start -p missing-upgrade-847129 --memory=3072 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:309: (dbg) Done: /tmp/minikube-v1.35.0.3376685709 start -p missing-upgrade-847129 --memory=3072 --driver=docker  --container-runtime=containerd: (1m16.264649654s)
version_upgrade_test.go:318: (dbg) Run:  docker stop missing-upgrade-847129
version_upgrade_test.go:323: (dbg) Run:  docker rm missing-upgrade-847129
version_upgrade_test.go:329: (dbg) Run:  out/minikube-linux-arm64 start -p missing-upgrade-847129 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:329: (dbg) Done: out/minikube-linux-arm64 start -p missing-upgrade-847129 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (1m36.426511468s)
helpers_test.go:175: Cleaning up "missing-upgrade-847129" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p missing-upgrade-847129
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p missing-upgrade-847129: (2.317120692s)
--- PASS: TestMissingContainerUpgrade (188.27s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-178134 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:108: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p NoKubernetes-178134 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd: exit status 14 (94.298093ms)

-- stdout --
	* [NoKubernetes-178134] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=21997
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/21997-2497/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/21997-2497/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)

TestNoKubernetes/serial/StartWithK8s (40.59s)
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:120: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-178134 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:120: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-178134 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (39.768006022s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-178134 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (40.59s)

TestNoKubernetes/serial/StartWithStopK8s (8.81s)
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:137: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-178134 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:137: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-178134 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (6.346520456s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-178134 status -o json
no_kubernetes_test.go:225: (dbg) Non-zero exit: out/minikube-linux-arm64 -p NoKubernetes-178134 status -o json: exit status 2 (337.584234ms)

-- stdout --
	{"Name":"NoKubernetes-178134","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:149: (dbg) Run:  out/minikube-linux-arm64 delete -p NoKubernetes-178134
no_kubernetes_test.go:149: (dbg) Done: out/minikube-linux-arm64 delete -p NoKubernetes-178134: (2.128053021s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (8.81s)

TestNoKubernetes/serial/Start (8.51s)
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:161: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-178134 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
E1201 20:16:46.971268    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
no_kubernetes_test.go:161: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-178134 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (8.51102541s)
--- PASS: TestNoKubernetes/serial/Start (8.51s)

TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0s)
=== RUN   TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads
no_kubernetes_test.go:89: Checking cache directory: /home/jenkins/minikube-integration/21997-2497/.minikube/cache/linux/arm64/v0.0.0
--- PASS: TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0.00s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.27s)
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-178134 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-178134 "sudo systemctl is-active --quiet service kubelet": exit status 1 (268.131554ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.27s)

TestNoKubernetes/serial/ProfileList (1s)
=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:194: (dbg) Run:  out/minikube-linux-arm64 profile list
no_kubernetes_test.go:204: (dbg) Run:  out/minikube-linux-arm64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (1.00s)

TestNoKubernetes/serial/Stop (1.32s)
=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:183: (dbg) Run:  out/minikube-linux-arm64 stop -p NoKubernetes-178134
no_kubernetes_test.go:183: (dbg) Done: out/minikube-linux-arm64 stop -p NoKubernetes-178134: (1.317236829s)
--- PASS: TestNoKubernetes/serial/Stop (1.32s)

TestNoKubernetes/serial/StartNoArgs (7.45s)
=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:216: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-178134 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:216: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-178134 --driver=docker  --container-runtime=containerd: (7.447308125s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (7.45s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.38s)
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-178134 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-178134 "sudo systemctl is-active --quiet service kubelet": exit status 1 (377.673618ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.38s)

TestStoppedBinaryUpgrade/Setup (1.01s)
=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (1.01s)

TestStoppedBinaryUpgrade/Upgrade (303.75s)
=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.35.0.3337057654 start -p stopped-upgrade-632869 --memory=3072 --vm-driver=docker  --container-runtime=containerd
E1201 20:19:27.087840    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.35.0.3337057654 start -p stopped-upgrade-632869 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (32.748805872s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.35.0.3337057654 -p stopped-upgrade-632869 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.35.0.3337057654 -p stopped-upgrade-632869 stop: (1.371012399s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-arm64 start -p stopped-upgrade-632869 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1201 20:19:52.599549    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:20:50.172466    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/addons-569760/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:21:46.971592    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-019259/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1201 20:22:55.667086    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-arm64 start -p stopped-upgrade-632869 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (4m29.630071666s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (303.75s)

TestStoppedBinaryUpgrade/MinikubeLogs (1.99s)
=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-arm64 logs -p stopped-upgrade-632869
version_upgrade_test.go:206: (dbg) Done: out/minikube-linux-arm64 logs -p stopped-upgrade-632869: (1.985214944s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (1.99s)

TestPause/serial/Start (79.38s)
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -p pause-916050 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd
E1201 20:29:52.600829    4305 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/21997-2497/.minikube/profiles/functional-428744/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
pause_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -p pause-916050 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd: (1m19.382822591s)
--- PASS: TestPause/serial/Start (79.38s)

TestPause/serial/SecondStartNoReconfiguration (8.54s)
=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-arm64 start -p pause-916050 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
pause_test.go:92: (dbg) Done: out/minikube-linux-arm64 start -p pause-916050 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (8.520278816s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (8.54s)

TestPause/serial/Pause (0.91s)
=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-916050 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.91s)

TestPause/serial/VerifyStatus (0.43s)
=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p pause-916050 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p pause-916050 --output=json --layout=cluster: exit status 2 (425.010118ms)

-- stdout --
	{"Name":"pause-916050","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 7 containers in: kube-system, kubernetes-dashboard, istio-operator","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-916050","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.43s)

TestPause/serial/Unpause (0.86s)
=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-linux-arm64 unpause -p pause-916050 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.86s)

TestPause/serial/PauseAgain (1.18s)
=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-916050 --alsologtostderr -v=5
pause_test.go:110: (dbg) Done: out/minikube-linux-arm64 pause -p pause-916050 --alsologtostderr -v=5: (1.178772628s)
--- PASS: TestPause/serial/PauseAgain (1.18s)

TestPause/serial/DeletePaused (3.3s)
=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p pause-916050 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p pause-916050 --alsologtostderr -v=5: (3.303563107s)
--- PASS: TestPause/serial/DeletePaused (3.30s)

TestPause/serial/VerifyDeletedResources (0.2s)
=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
pause_test.go:168: (dbg) Run:  docker ps -a
pause_test.go:173: (dbg) Run:  docker volume inspect pause-916050
pause_test.go:173: (dbg) Non-zero exit: docker volume inspect pause-916050: exit status 1 (30.905787ms)

-- stdout --
	[]

-- /stdout --
** stderr ** 
	Error response from daemon: get pause-916050: no such volume

** /stderr **
pause_test.go:178: (dbg) Run:  docker network ls
--- PASS: TestPause/serial/VerifyDeletedResources (0.20s)


Test skip (34/321)

Order skipped test Duration
5 TestDownloadOnly/v1.28.0/cached-images 0
6 TestDownloadOnly/v1.28.0/binaries 0
7 TestDownloadOnly/v1.28.0/kubectl 0
14 TestDownloadOnly/v1.34.2/cached-images 0
15 TestDownloadOnly/v1.34.2/binaries 0
16 TestDownloadOnly/v1.34.2/kubectl 0
22 TestDownloadOnly/v1.35.0-beta.0/preload-exists 0.16
25 TestDownloadOnly/v1.35.0-beta.0/kubectl 0
29 TestDownloadOnlyKic 0.44
31 TestOffline 0
42 TestAddons/serial/GCPAuth/RealCredentials 0.01
49 TestAddons/parallel/Olm 0
56 TestAddons/parallel/AmdGpuDevicePlugin 0
60 TestDockerFlags 0
64 TestHyperKitDriverInstallOrUpdate 0
65 TestHyperkitDriverSkipUpgrade 0
112 TestFunctional/parallel/MySQL 0
116 TestFunctional/parallel/DockerEnv 0
117 TestFunctional/parallel/PodmanEnv 0
130 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0
131 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
132 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0
207 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL 0
211 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv 0
212 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv 0
224 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig 0
225 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
226 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS 0
261 TestGvisorAddon 0
283 TestImageBuild 0
284 TestISOImage 0
348 TestChangeNoneUser 0
351 TestScheduledStopWindows 0
353 TestSkaffold 0

TestDownloadOnly/v1.28.0/cached-images (0s)
=== RUN   TestDownloadOnly/v1.28.0/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.0/cached-images (0.00s)

TestDownloadOnly/v1.28.0/binaries (0s)
=== RUN   TestDownloadOnly/v1.28.0/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.0/binaries (0.00s)

TestDownloadOnly/v1.28.0/kubectl (0s)
=== RUN   TestDownloadOnly/v1.28.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.28.0/kubectl (0.00s)

TestDownloadOnly/v1.34.2/cached-images (0s)
=== RUN   TestDownloadOnly/v1.34.2/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.34.2/cached-images (0.00s)

TestDownloadOnly/v1.34.2/binaries (0s)
=== RUN   TestDownloadOnly/v1.34.2/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.34.2/binaries (0.00s)

TestDownloadOnly/v1.34.2/kubectl (0s)
=== RUN   TestDownloadOnly/v1.34.2/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.34.2/kubectl (0.00s)

TestDownloadOnly/v1.35.0-beta.0/preload-exists (0.16s)
=== RUN   TestDownloadOnly/v1.35.0-beta.0/preload-exists
I1201 19:06:36.081554    4305 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
W1201 19:06:36.190149    4305 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
W1201 19:06:36.237615    4305 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
aaa_download_only_test.go:113: No preload image
--- SKIP: TestDownloadOnly/v1.35.0-beta.0/preload-exists (0.16s)

TestDownloadOnly/v1.35.0-beta.0/kubectl (0s)
=== RUN   TestDownloadOnly/v1.35.0-beta.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.35.0-beta.0/kubectl (0.00s)

TestDownloadOnlyKic (0.44s)
=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:231: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p download-docker-916539 --alsologtostderr --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:248: Skip for arm64 platform. See https://github.com/kubernetes/minikube/issues/10144
helpers_test.go:175: Cleaning up "download-docker-916539" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p download-docker-916539
--- SKIP: TestDownloadOnlyKic (0.44s)

TestOffline (0s)
=== RUN   TestOffline
=== PAUSE TestOffline
=== CONT  TestOffline
aab_offline_test.go:35: skipping TestOffline - only docker runtime supported on arm64. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestOffline (0.00s)

TestAddons/serial/GCPAuth/RealCredentials (0.01s)
=== RUN   TestAddons/serial/GCPAuth/RealCredentials
addons_test.go:759: This test requires a GCE instance (excluding Cloud Shell) with a container based driver
--- SKIP: TestAddons/serial/GCPAuth/RealCredentials (0.01s)

TestAddons/parallel/Olm (0s)
=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm
=== CONT  TestAddons/parallel/Olm
addons_test.go:483: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestAddons/parallel/AmdGpuDevicePlugin (0s)
=== RUN   TestAddons/parallel/AmdGpuDevicePlugin
=== PAUSE TestAddons/parallel/AmdGpuDevicePlugin
=== CONT  TestAddons/parallel/AmdGpuDevicePlugin
addons_test.go:1033: skip amd gpu test on all but docker driver and amd64 platform
--- SKIP: TestAddons/parallel/AmdGpuDevicePlugin (0.00s)

TestDockerFlags (0s)
=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing containerd
--- SKIP: TestDockerFlags (0.00s)

TestHyperKitDriverInstallOrUpdate (0s)
=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:37: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

TestHyperkitDriverSkipUpgrade (0s)
=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:101: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

TestFunctional/parallel/MySQL (0s)
=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctional/parallel/MySQL (0.00s)

TestFunctional/parallel/DockerEnv (0s)
=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv
=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

TestFunctional/parallel/PodmanEnv (0s)
=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

                                                
                                    
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL (0.00s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv (0.00s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv (0.00s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

=== RUN   TestISOImage
iso_test.go:36: This test requires a VM driver
--- SKIP: TestISOImage (0.00s)

=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing containerd container runtime
--- SKIP: TestSkaffold (0.00s)